
SERVO Magazine, December 2007

12.2007
VOL. 5 NO. 12
SERVO 12.2007
5
22  The Combat Zone
    ENTER WITH CAUTION!

31  Votrax SC-01 to SpeakJet Translator
    by Robert Doerr
    Break the language barrier with your HERO robot.

36  GPS
    by Michael Simpson
    Part 3: Parse positional data from the NMEA protocol.

43  Spare the Rod, Spoil the Bot
    by Karla Conn
    Rewards and punishments can serve as fundamental motivations for your robot to learn by.

46  Programming by Demonstrating Robot Task Primitives
    by Alexander Skoglund and Boyko Iliev
    Using imitation to teach robots isn't as straightforward as you'd think, but it can be done.

51  Using FRAM for Non-Volatile Storage
    by Fred Eady
    If EEPROM densities are too small for your robotic application and you don't want to design in a hard drive or battery-backed SRAM, then FRAM is your answer.

Features & Projects
PAGE 14
Published Monthly By
T & L Publications, Inc.
430 Princeland Court
Corona, CA 92879-1300
(951) 371-8497
FAX (951) 371-3052
Product Order Line 1-800-783-4624
www.servomagazine.com
Subscriptions
Inside US 1-877-525-2539
Outside US 1-818-487-4545
P.O. Box 15277
North Hollywood, CA 91615
PUBLISHER
Larry Lemieux
publisher@servomagazine.com
ASSOCIATE PUBLISHER/
VP OF SALES/MARKETING
Robin Lemieux
display@servomagazine.com
EDITOR
Bryan Bergeron
techedit-servo@yahoo.com
CONTRIBUTING EDITORS
Jeff Eckert Tom Carroll
Gordon McComb David Geer
Pete Miles R. Steven Rainwater
Michael Simpson Kevin Berry
Fred Eady Robert Doerr
Alexander Skoglund Boyko Iliev
Karla Conn Dan Albert
James Baker Chad New
Paul Ventimiglia James Isom
CIRCULATION DIRECTOR
Tracy Kerley
subscribe@servomagazine.com
MARKETING COORDINATOR
WEBSTORE
Brian Kirkpatrick
sales@servomagazine.com
WEB CONTENT
Michael Kaudze
website@servomagazine.com
PRODUCTION/GRAPHICS
Shannon Lemieux
Michele Durant
ADMINISTRATIVE ASSISTANT
Debbie Stauffacher
Copyright 2007 by
T & L Publications, Inc.
All Rights Reserved
All advertising is subject to publisher’s approval.
We are not responsible for mistakes, misprints,
or typographical errors. SERVO Magazine
assumes no responsibility for the availability or
condition of advertised items or for the honesty
of the advertiser. The publisher makes no claims
for the legality of any item advertised in SERVO.
This is the sole responsibility of the advertiser.
Advertisers and their agencies agree to
indemnify and protect the publisher from any
and all claims, action, or expense arising from
advertising placed in SERVO. Please send all
editorial correspondence, UPS, overnight mail,
and artwork to: 430 Princeland Court,
Corona, CA 92879.
True Autonomy
When roboticists talk of
autonomy, it’s generally understood
that this elusive goal will be achieved
through advances in computational
methods, such as artificial intelligence
algorithms, more powerful processors,
and increasingly powerful and
affordable sensors. However, achieving
truly autonomous robots will require
more than simple computational
evolution. It’s a misnomer to call
a robot that can navigate a
room without human assistance
‘autonomous’ when the duration of
autonomy is limited to perhaps a half
hour because of battery life. Other
than simplistic stimulus-response
BEAM robots (see Figure 1), the Mars
rovers are perhaps the best examples
of computationally and energetically
autonomous robots. However, even
the rovers are controlled remotely by
scientists at NASA.
The advances in battery
technology, fuel cells, and power
management chips haven’t kept pace
with computational advances in
energy management, such as behavior
modification. Unfortunately, behavior
tactics such as resting, altering speed
or path to reflect remaining energy
stores, and shutting down unnecessary
sensors can only go so far in extending
the operating time of a robot. New
sources of energy must be identified
and perfected.
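The behavior tactics just mentioned (resting, slowing down, shedding sensors) can be sketched as a simple energy policy. This is purely illustrative; the thresholds, sensor names, and speed factors below are assumptions, not taken from any particular robot:

```python
# Hypothetical energy-aware behavior policy. The thresholds, sensor names,
# and speed factors are illustrative assumptions, not from any real robot.

def energy_policy(battery_fraction, base_speed=1.0):
    """Return (speed, active_sensors, resting) for a given battery level."""
    if battery_fraction < 0.10:
        # Nearly empty: rest, keep only collision sensing alive.
        return 0.0, ["bumpers"], True
    if battery_fraction < 0.30:
        # Low: slow down and shut off power-hungry sensors.
        return base_speed * 0.4, ["bumpers", "ir"], False
    # Healthy charge: full speed, everything on.
    return base_speed, ["bumpers", "ir", "camera", "sonar"], False
```

As the editorial notes, tricks like these only stretch a fixed energy budget; they don't enlarge it.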
Although there is ample
commercial pressure to develop higher
capacity energy sources and more
effective energy management devices,
there are also significant incentives
from the military. According to the
DOD, soldiers of the near future are
expected to be assisted by electronic
devices ranging from audio, video,
and data communications equipment,
night vision gear, and wearable
computers, to exoskeletons. And these
devices will require an unprecedented
amount of portable power.
Mind/Iron (continued), by Bryan Bergeron, Editor

FIGURE 1. Solar-powered, light-seeking BEAM robot.

In response to this need, the Department of Defense Research and Engineering Wearable Power Prize is offering $1M to the first-place winner for the best wearable electric power system prototype. The competition — which is open to individual US citizens 21 or older — will be held in the fall of 2008. The grand prize goes to the developer of the technically superior power vest that weighs 4 kg or less, operates continuously for four days, and provides 20 W average and 200 W peak power. See www.dod.mil/ddre/prize/topic.html#7 for details on the competition.
Even if you don't take part in the competition, consider
the energy autonomy of your next robot design. While you
probably don't have access to Stirling radioisotope thermal
generators or other esoteric energy sources available to military
robotics designers, there are numerous promising technologies
that you are free to explore. One that I've followed for several
years is illustrated by the predatory robot EcoBot II, developed
by the University of the West of England in Bristol.
The EcoBot II uses a microbial fuel cell to generate
electricity from flies. Bacteria in the microbial fuel cells
metabolize sugars in the flies, releasing electrons in the
process. The robot isn't yet up to the capabilities of the
Mr. Fusion Home Energy Reactor-equipped DeLorean
featured in “Back to the Future” — top speed is 10
centimeters per hour. However, the EcoBot II can travel
for five days on just eight flies. If you have an aversion
to flies and other decaying organic matter, you can try
your hand at extending the basic BEAM robots,
available from several vendors featured in SERVO.
SV
Dear SERVO:
In reference to the September '07 Robytes. Holy cow! $69 million for an RC airplane? Wow, where can I sign up? I think as a taxpayer I should feel screwed! Who am I? I used to fly RC planes
before I became a pilot. I’ve built a four seat airplane, and been
president of an EAA (experimental aircraft association) chapter. I
know a bit about what airplanes are, and what they cost.
One of the members of our EAA chapter built a Lancair 4,
which would be a 300 mile per hour airplane. He went top shelf
on it, and spent about $400,000 on it. Sure, it only has half the
payload of the MQ-9 (1,550 lbs), but it
seems like for not a lot more, one could
build it bigger, and get the payload.
Looking at an Epic Dynasty, it has
3,300 lbs payload, and is priced under $2
million; it’s capable of 340 knots. The
specs might be misleading with the empty
and max takeoff weights but that is with
an interior, and equipment for people.
Strip all that out and you can have a UAV.
Basically, the remote control is some extra wiring to the autopilot servos. Am I to believe that is worth some $50 million?
So, maybe someone might say I am
comparing “toy” airplanes to some
commercial aircraft. How about a Boeing
737? Well, right from Boeing, ready to fly,
they list at $49 million. I guess a $20
million conversion would be reasonable
(probably not). But this aircraft is capable
of hauling over 30,000 lbs (about 10X the
MQ-9). It can also cruise at over 500 mph.
I am very sad to hear the way things are going in the
UAV market.
People claim the UAVs are supposed to be cheaper and safer, but it still takes a crew of two to fly this MQ-9, whereas an F-35A Lightning II will cost only about $50 million and takes a crew of one. It's capable of carrying 18,000 lbs and flying past Mach 1 in stealth mode carrying smart weapons. This manned aircraft is clearly the more useful aircraft.
Tom Brusehaver
Dallas, TX
Resources
• EcoBot II — Self-sustaining killer robot creates a stink. New Scientist, September 9, 2004. www.newscientist.com/article.ns?id=dn6366
• EcoBot II in action. www.youtube.com/watch?v=1Nuw654pFbU
• BEAM Robots. www.solarbotics.net; www.solarbotics.com; www.geocities.com/SouthBeach/6897/beam2.html
• How Fuel Cells Work. How Stuff Works. www.howstuffworks.com/fuel-cell.htm
Fooling Virtual Robots
A highly abstract but interesting
concept has emerged from the University
College London (www.ucl.ac.uk),
where Dr. Beau Lotto and other
researchers have been experimenting
with “virtual robots” to understand why
humans can be fooled by visual illusions.
Some folks at the UCL Institute of
Ophthalmology trained artificial neural
networks (essentially, virtual toy robots
with tiny virtual brains) to “see” correctly
(i.e., as we do). They trained the virtual
critters to predict surface reflectance in a
variety of 3D scenes such as found in
nature. When the bots examined a range
of grey scale illusions, they often made
the same mistakes that humans do.
Among the study’s conclusions is
that “it is likely that illusions must be
experienced by all visual animals regard-
less of their particular neural machin-
ery.” For details and some entertaining
illusions, visit www.lottolab.org.
Concept Car Includes
Companion Bot
At the latest Tokyo Motor Show,
Nissan (www.nissanusa.com) unveiled
the Pivo 2 electric concept car, evolved
from the original three-seater that first
appeared in 2005. It is mechanically as
strange as it looks, given that the wheels
(each of which is powered by its own
motor) can turn up to 90°, and the cabin
can rotate 360°, so you can drive it for-
ward, sideways, or backward and never
need a reverse gear. It’s powered by lithi-
um-ion batteries and uses “by-wire” con-
trol technologies rather than mechanical
systems for braking and steering.
But possibly the strangest feature
is the “Robotic Agent” that rides with
you everywhere you go. It’s basically
a bobbling head, located near the
steering wheel, that communicates
with you in either English or Japanese.
Aimed at making “every journey less
stressful,” the Agent speaks in a “cute
electronic voice” and provides a link to
everything from basic vehicle functions
to searching for a parking spot.
According to Nissan, the head can
sense the driver’s mood by analyzing
facial expressions (it has digital eyes
and a microphone) and deliver prepro-
grammed phrases that might include
“Relax, don’t worry,” “You’ve dripped
Big Mac sauce into your lap,” and “Put
away that gun.” At this point, the car is
fully functional but — alas — is still too
expensive for the commercial market.
Fortune Teller in a Bowl
Also too expensive for the com-
mercial market but there anyway, is
the Swami Conversational Robot, avail-
able from Neiman Marcus (www.neimanmarcus.com). This goes way
beyond the old mechatronic gypsy for-
tune teller machines of penny arcade
fame, although, peeping out from his
glass dome, he does bear some resem-
blance to Zoltar. Under the control of
a laptop running special AI software,
this guy generates facial expressions
using some 30 micromotors and can
watch you via eye-mounted cameras.
Apparently, you can teach him to
recognize family members, have
meaningful conversations with you,
and answer questions intelligently.
That’s probably more than the afore-
mentioned family members can do,
but the catch is that this thing costs
more than my first house: $75,000.
Give ‘em the Bird for
Christmas
On a level that will allow it to fit
your Christmas budget is Squawkers
In this image, it appears that the dark
stripes on top are darker than the
white stripes on the front of the
object. But a mask placed over the
image reveals that the “white” stripes
in the foreground are exactly the
same as the “grey” ones on top.
Thanks to Beau Lotto/UCL.
Nissan’s Pivo 2 concept car. Photo
courtesy of Nissan Motor Company.
The Swami Conversational Robot.
Photo courtesy of Neiman Marcus.
by Jeff Eckert
Robytes
McCaw, recommended for children over
5 years and very lonely people of all
ages. Widely available on the Internet
for about $55, it talks, squawks, and is
nearly as annoying as a real parrot. He
can repeat any words spoken to him,
give appropriate responses to prepro-
grammed commands, and learn new
responses. Put him in dance mode, and
he will sashay to whatever music you
play or even provide his own music.
In terms of mechanics, Squawkers
can move his head, flap his wings, eat a
cracker, and even give you a smooch
when you touch his beak. Probably the
best feature is that he goes to sleep
when his eyes are covered or the room
gets dark. You can see him at www.hasbro.com or in your local toy store.
Robot Plays the Theremin
As most readers will already know,
the theremin — invented by Leon
Theremin in 1919 — is one of the earliest
completely electronic musical instruments
and the first to require no physical contact
with the “musician.” As far as I can verify,
it was played only by human beings until
about 2003, when Ranjit Bhatnagar built
Lev specifically for that purpose.
Lev, the product of a floor lamp,
some metallic junk, and a few micro-
processors, has been a solo act since
then but is now accompanied by a few
“thumpbots,” which provide a rhythmic
background to the theremin’s notorious-
ly unappealing sound. If you’re curious,
a video of the band playing a tune that
is said to be Gnarls Barkley’s “Crazy”
(but sounds more like belly dance music)
can be viewed at www.youtube.com/
watch?v=19RJEnNUg1I.
Mini Chopper Fights Fires
Most unmanned surveillance seems
to be performed by fixed-wing aircraft
these days, but the West Midlands Fire
Service, over in Birmingham, U.K., is
trying out a small chopper, which it has
dubbed the Incident Support Imaging
System (ISIS). The device doesn’t
actually put out fires, but it does provide
live video from above the incident
scene and aids firefighters in planning
an emergency response.
Such incidents can also include
general rescue operations, inspection
of water supplies and gas cylinders,
and so on. ISIS is actually a modified
MD4-200 vertical takeoff and landing
(VTOL) micro aerial vehicle (MAV) built
by Microdrones GmbH (www.microdrones.com) over in Germany.
The composite shell provides lower
weight and EMI shielding and houses
instruments that can include a GPS,
accelerometers, gyroscopes, a magne-
tometer, a still or video camera, and pres-
sure, temperature, and humidity sensors.
The unit weighs only about 2 lbs (900 g)
and carries up to nearly 0.5 lbs (200 g).
Depending on the payload, the four
battery-powered rotors can keep it aloft
for up to 20 min. In spite of the $60,000 price tag, Microdrones has sold 250 of them in the 16 months since their introduction.
Biped Bot Responds to
PS2 Controller
Closer to home, Dallas-based
KumoTek (www.kumotek.com) is a
builder of custom and standard bots for
education, research, entertainment, and
some industrial applications. (Kumo, in
case you were wondering, is Japanese for
“spider.”) The news there is the introduc-
tion of the model KT-X, billed as the first
low-cost bipedal robot platform that can
be controlled via a wireless PS2 controller.
The 13-in, 2.9-lb robot can walk,
run, do somersaults, and stand up from
a face-up or face-down position. KT-X
has 17 degrees of freedom, is driven by
a 60 MHz HV processor, and comes with
75+ preprogrammed motions. As of this
writing, the unit is still under develop-
ment, but it should be commercially
available “within a few months.”
SV
Squawkers McCaw, the latest in
the Furreal Friends lineup.
Photo courtesy of Hasbro.
Lev the musical robot now performs
with “thumpbot” friends. Shown with
a Moog Etherwave instrument. Photo
courtesy of www.moonmilk.com
A special version of the MD4-200
is being evaluated for fire and
rescue operations. Photo courtesy
of Microdrones GmbH.
The new KT-X.
The competition is sponsored by
the Office of Naval Research
(ONR), as well as by AUVSI,
according to a Robotics@Maryland aca-
demic paper, “Tortuga: Autonomous
Underwater Vehicle,” authored by
several club members and advisors.
The competition “tasks” each
robot with six challenges:
• Maintain a straight course and head-
ing through the starting gate.
• Locate the flashing “start” buoy.
• Ram that buoy “to free it.”
• Locate the first “orange pipeline
segment.”
• Follow the orange pipeline until it
meets a second flashing buoy, which it
must also ram.
• Follow two more pipelines, locate a
sonar beacon, and follow it to the
“treasure octagon.”
Team members based the robot’s
design and construction on the best
possible completion of these tasks.
Tortuga Design and
Construction
A serviceable aluminum chassis
surrounds and supports Tortuga’s
mechanics, as well as an 18.5” long by
8” diameter clear acrylic tube, which
houses the watertight components.
The team members selected the chassis
design for ease of access to the robot’s
functional parts, electronics, and other
“innards” and attachments.
The robot uses an inertial
navigation system (INS) to establish its
location and maintain its heading.
The system comprises sensors,
processors, and software. These enable
the vehicle to establish and change
location by adjusting its velocity.
The INS includes the following
hardware and software:
1) Three magnetometers (to measure
the Earth’s magnetic field).
2) Three gyroscopes (to measure angular velocity).
3) Three accelerometers (to measure
Contact the author at geercom@alltel.net
by David Geer
Tortuga — From Isle of
Pirates to Underwater Spy
The Isle of Tortuga, Haiti — once a haven for pirates — lives on as the namesake for the University of Maryland Robotics Club's submersible competition robot. Tortuga — the Club's entry in the Association for Unmanned Vehicles and Systems International's (AUVSI's) annual Autonomous Underwater Vehicle (AUV) competition — first appeared in the yearly event in summer 2007.
Tortuga was the first robot that the
University of Maryland entered into the
Association for Unmanned Vehicles
and Systems International’s (AUVSI’s)
annual Autonomous Underwater
Vehicle (AUV) competition, according
to Scott Watson, a University of
Maryland student and Robotics Club
member. This is a close-up, aft (tail,
stern) angle view of Tortuga.
The AUV is equipped with four
Seabotix thrusters (three of four are
visible) to control depth, pitch, yaw,
and horizontal translation, according to students who crafted the submersible robot. Roll is statically stabilized with a careful distribution of foam and small weights, and by placing heavy electronics (such as the batteries) at the bottom of the pressure hull, Watson notes.
The AUV uses a MacMini to interface with all its sensors and motor controllers
through USB ports.
Photos are courtesy of Scott Watson,
University of Maryland student and
Robotics Club member.
GEERHEAD
linear acceleration).
4) An inertial measurement unit (IMU)
houses the aforementioned nine
sensors.
5) Closed-loop controller software to
process force vector equations.
The robot relies on this combination of sensors and sensor data for navigation because GPS signals don't travel underwater.
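As a hedged illustration of how data from such an IMU is commonly fused (this is not Tortuga's actual navigation code; the single-axis complementary filter, gain, and variable names below are my assumptions):

```python
# Illustrative complementary filter for one axis (pitch), assuming the IMU
# supplies an angular rate (rad/s) from a gyroscope and a gravity-referenced
# angle estimate (rad) from the accelerometers. A common fusion technique,
# not the team's actual closed-loop controller.

def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Blend the integrated gyro rate (smooth but drifting) with the
    accelerometer angle (noisy but drift-free)."""
    gyro_estimate = pitch + gyro_rate * dt  # dead-reckoned angle
    return alpha * gyro_estimate + (1.0 - alpha) * accel_pitch
```

With alpha near 1, the gyro dominates over short timescales while the accelerometer slowly pulls the estimate back toward the true gravity-referenced angle.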
Attaining Objectives
To get through the starting gate
properly, Tortuga uses a combination
of position confirmations from its
forward camera and output from a
nonlinear adaptive controller.
A nonlinear adaptive controller
takes sensor data as input and uses it
to calculate the orientation (location,
position) of the robot and how that is
changing, according to Scott Watson,
University of Maryland student and
Robotics Club member.
“It does some calculations and
then determines how best to use the
actuators available (thrusters, in our
case) to do something desirable, like
maintain heading, depth, pitch, roll,
and velocity,” explains Watson.
The nonlinear aspect means that
the controller can take the many differ-
ent forces acting on the robot into
account, according to Watson. If the
team could guarantee that only one
force contributed to the robot moving
up and down in the water and, similar-
ly, that only one thruster was able to
affect that up and down motion, then
the robot would only need a linear
controller, explains Watson.
“But, in nature,” Watson says,
“forces tend to constructively and
destructively interfere with each
other in a way that may not be deter-
minable from the available sensors.”
The adaptive aspect means
the controller knows that the input
(parameters) it receives from the
sensors isn’t necessarily 100 per-
cent accurate and that it is permit-
ted to intelligently adjust those
parameters, by use of its pro-
gramming, according to Watson.
“For example, it’s impossible to
measure buoyancy or roll moments per-
fectly, but an adaptive controller will, in
a sense, learn how to adjust these
parameters to more successfully control
the vehicle by depending on sensor
measurements,” illustrates Watson.
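Watson's description of the adaptive aspect can be caricatured in a few lines. The update law, gains, and sign conventions here are illustrative assumptions only, not the team's controller:

```python
# Toy illustration of the "adaptive" idea: the controller keeps an estimate
# of an unmeasurable parameter (net buoyancy force, arbitrary units) and
# nudges it whenever a persistent depth error suggests the model is wrong.
# Gains and sign conventions are assumptions for illustration only.

def adapt_buoyancy(buoyancy_est, depth_error, learning_rate=0.05):
    """Move the buoyancy estimate in proportion to the lingering error."""
    return buoyancy_est + learning_rate * depth_error

def depth_thrust(depth_error, buoyancy_est, kp=2.0):
    """Proportional depth correction plus feedforward for the estimated
    buoyancy, so the thrusters pre-compensate instead of always lagging."""
    return kp * depth_error + buoyancy_est
```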
Next, we have buoy ramming.
Buoy ramming sounds like fun
and, in this instance, it is a carefully cal-
culated maneuver. The buoy is a flash-
ing light housed in a watertight enclo-
sure. The robot’s task is to locate this
buoy and run directly into it to knock it
loose from its mooring, according to
Watson. “This demonstrates vehicle
control, valid image processing, and
University of Maryland student and Robotics Club member Matt Bakalar is checking for air bubbles that might emanate
from the watertight enclosure that
protects the AUV’s electronics.
Devastating leaks can come
from the o-ring seals, as well as the
wet-matable connectors drilled into
the aluminum end caps of the pres-
sure hull. If all goes well, the lead
controller programmer will connect over SSH (secure shell, a remote login protocol) to the MacMini to begin
testing the robot’s stability under
active control, according to Watson.
University of Maryland
student and Robotics Club
member Stepan Moskovchenko
submerges the watertight pres-
sure hull to watch for air bubbles
and water accumulation beneath
the electronics and batteries.
“The first leak in the lifetime
of the robot was discovered
minutes earlier due to user error
with the homemade underwater
FireWire connector,” says Watson.
The straps hold aluminum
CNC’d end caps with piston style
o-ring seals in place on an 8” diameter acrylic tube, Watson explains.
Three student team members check whether the inertial measurement unit
(IMU) is level within the vehicle. While hanging from the team tent at the competi-
tion in San Diego, the students
attempt to calibrate the internal
magnetometer and tweak gains
in the controller code.
“The team uses a
MEMSense Nano IMU with
Micro-Electro-Mechanical
Systems (MEMS) technology.
This affords a relatively low cost
and lightweight solution for
inertial measurements and to
track the course of the robot,”
says Watson.
artificial intelligence,” explains Watson.
The robot employs two Unibrain
Fire-I cameras for object recognition.
These cameras stream video via
FireWire connection to the MacMini
(1.83 GHz dual core, 2 GB RAM),
which is the robot’s onboard computer.
Image processing algorithms on the
MacMini, written in C++, use the
OpenCV image processing library
to identify competition objects
like the buoy (and, of course, the
orange pipelines it must follow),
according to a Robotics@Maryland
Tortuga academic paper.
The artificial intelligence comes from the robot's “higher level autonomy software” running on its onboard computer.
A gigabit Ethernet tether stretches the
distance between Tortuga’s onboard
MacMini computer and a com-
puter on dry land. “We usually
communicate with the onboard
computer over a shell session, that is,
over the Linux console,” says Watson.
This is especially useful during testing.
To aid the robot in recognizing and
following the pipelines, the team uses
color filters to bring out the orange,
according to Watson. “Then we run an
edge detection algorithm that gives us
a collection of points that belong to
edges in the image. Finally, we feed
these points into another algorithm
called a Hough transform, which picks
out straight lines from those edge
points,” Watson continues.
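The color-filter, edge-detection, and Hough-transform chain Watson describes can be shown in miniature. The team's real implementation is C++ with OpenCV; this pure-Python sketch with made-up thresholds just illustrates the three stages:

```python
import math

# Miniature version of the pipeline-finding chain: 1) keep only pixels
# bright enough to count as "orange," 2) find edge points, 3) vote in
# Hough space to recover the dominant straight line. Thresholds and the
# crude edge detector are illustrative assumptions, not the team's code.

def orange_mask(image, threshold=128):
    """Binary mask of pixels passing the (simplified) color filter."""
    return [[1 if px >= threshold else 0 for px in row] for row in image]

def edge_points(mask):
    """Pixels whose left or upper neighbor differs (a crude edge detector)."""
    pts = []
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v and (x == 0 or row[x - 1] == 0 or y == 0 or mask[y - 1][x] == 0):
                pts.append((x, y))
    return pts

def hough_dominant_angle(points, angle_steps=180):
    """Return the line angle (degrees) collecting the most collinear votes."""
    votes = {}
    for step in range(angle_steps):
        theta = math.pi * step / angle_steps
        for x, y in points:
            # Each edge point votes for every (theta, rho) line through it.
            rho = round(x * math.cos(theta) + y * math.sin(theta), 1)
            votes[(step, rho)] = votes.get((step, rho), 0) + 1
    (best_step, _), _ = max(votes.items(), key=lambda kv: kv[1])
    return 180.0 * best_step / angle_steps
```

A vertical orange stripe, for instance, yields a column of edge points whose strongest Hough vote lands at theta = 0.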
“Marker dropping” is another task
in the AUVSI competition. In this case,
the robot drops six-inch by half-inch red PVC pipe sections into target boxes as
markers at two points in the competi-
tion. A weight in the PVC makes sure it
drops, according to club members and
students.
Team members mount these PVC
pipe sections inside Tortuga’s deploy-
ment tubes, which are fitted with
permanent and electromagnets to hold
and deploy the markers. When the robot
energizes the electromagnet, it cancels
the permanent magnet’s magnetic field,
releasing the marker over its target.
The team mounted the marker
tubes next to the ventral video camera
in order to minimize positioning error.
The ventral camera is the one on
Tortuga’s belly, specifically designated
to watch for targets and for the orange
pipelines, according to Watson.
The robot uses sound to help it
locate its “treasure” in the final task of
the competition. A sonar beacon, seated beneath the octagonal treasure target, creates the sounds. A three-sensor hydrophone array on the robot's side senses these underwater sounds, each hydrophone acting much like a microphone. A series of microcontrollers and analog filters determines
the frequency and time of arrival of the
sounds to pinpoint the location of the
sonar, according to Watson.
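The principle behind the hydrophone array, time difference of arrival (TDOA), can be illustrated for a single pair of sensors. The sound speed, spacing, and function names below are assumptions for illustration, not the team's numbers:

```python
import math

# Toy TDOA bearing estimate: two hydrophones a fixed distance apart hear
# the same ping slightly out of step; the delay fixes the angle of arrival.
# Tortuga uses three sensors (resolving ambiguity); this two-sensor sketch
# just shows the core geometry.

SOUND_SPEED = 1500.0  # m/s, roughly the speed of sound in seawater

def bearing_from_delay(dt, spacing):
    """Angle of arrival in radians (0 = broadside) for two hydrophones
    `spacing` meters apart that hear the ping `dt` seconds apart."""
    path_diff = SOUND_SPEED * dt
    ratio = max(-1.0, min(1.0, path_diff / spacing))  # clamp against noise
    return math.asin(ratio)
```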
System Support
A microcontroller network offloads
low-level tasks from the MacMini and
supports the robot. “For example,
UM students and Robotics Club
members Stepan Moskovchenko [left]
and Joe Gland [right] inspect the
thruster and camera housing cables for
damage after a competition-qualifying
run that knocked the camera housing
loose.
The external frame, made of
80/20 tubing, performed one of its
design functions by protecting all the
electronics and cabling during the
“jolt.” A little bit of rope and the team
is ready to go straight back to testing
code to get the robot back in the
water for another run, Watson
exclaims!
UM students and Robotics Club members take a moment to pose behind the
Autonomous Underwater Vehicle (AUV) they designed and built — in nine months — for
the Association for Unmanned
Vehicles and Systems International
annual competition.
The Maryland students fin-
ished 13th out of a field of 27 teams
in this their first year, winning a
$500 prize. “They are proud of their
accomplishment and look forward
to spending more time developing
the artificial intelligence code and
refining sensor systems to better
compete with more experienced
teams in 2008,” Watson says.
Robotics Club member Nathan
Davidge waits at Reagan National
Airport with the team’s AUV robot.
All the electronics and parts for
the AUV fit in the travel case on the
seat to the right of Nathan. “Even at
the airport, the student team was
working on integrating a new binary
protocol for more reliable communi-
cation to the motor controllers from
MacMini,” says Watson.
collecting hundreds of voltage measurements from a sensor,
averaging them together, and performing small calculations
that the main computer can ask for without worrying about
the electrical details of how it was done” is an optimization
of the architecture, as Watson explains.
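The kind of low-level job being offloaded, collecting many raw voltage readings and answering the main computer with one averaged value, might look like this in miniature (names and window size are illustrative; the real work happens on microcontrollers, not in Python):

```python
from collections import deque

# Sketch of a sensor-averaging offload: raw voltages accumulate in a
# fixed-size window, and the main computer asks for one clean number
# without worrying about the electrical details. Illustrative only.

class SensorAverager:
    def __init__(self, window=100):
        self.samples = deque(maxlen=window)  # oldest readings drop off

    def add_reading(self, volts):
        self.samples.append(volts)

    def averaged_voltage(self):
        """The single value the main computer requests."""
        if not self.samples:
            return 0.0
        return sum(self.samples) / len(self.samples)
```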
A sensor PCB contains most of the microcontrollers, which share an eight-bit-wide parallel bus that coordinates information flow and job instructions.
Conclusion
AUVSI held this year’s competition July 11-15 at the
Space and Naval Warfare Systems Center TRANSDEC Facility
in San Diego, CA. The University of Maryland expects to see
Tortuga or its ‘offspring’ competing again next year.
SV
RESOURCES
• Department of Electrical and Computer Engineering, A. James Clark School of Engineering, University of Maryland: www.ece.umd.edu
• Robotics@Maryland Club: http://ram.umd.edu/trac
• Replacement thrusters: www.seabotix.com
• AUVSI: www.auvsi.org
HE8EJIFH;<;H>?J;9)0'HE8EJIFH;<;H>?J;9)0'


7KHUHVXOWVRIDQLQIRUPDOSROOWDNHQUHFHQWO\DWWKH)LUVW
$QQXDO:RUOG'RPLQDWLRQ6\PSRVLXPDUHQRZLQ5RERWV
SUHIHU+LWHFVHUYRVRYHURWKHUVHUYREUDQGV7KH\NQRZWKHZLGH

VHOHFWLRQRI+LWHFDQDORJDQGGLJLWDOVHUYRVSURYLGHWKHPZLWKWKHSRZHUDQGGHSHQG
DELOLW\
QHHGHGWRHYHQWXDOO\WDNHRYHUWKH:RUOG0DNH\RXUURERWKDSS\XVH+LWHFVHUYRV
3DLQH6WUHHW_3RZD\_&DOLIRUQLD___ZZZKLWHFUFGFRP
>_j[Y
Ej^[h8hWdZ
.EW2OBOT3ERVOS.EW2OBOT3ERVOS
+656*
6SHHGVHF
7RUTXHR]LQ
6WHHO*HDUV
+656*
6SHHGVH
F
7RUTXHR]LQ
6WHHO*HDUV
+657*
6SHHGVHF
7RUTXHR]LQ
7LWDQLXP
$//63(&,),&$7,216$792/76
Q. Do you know of any humanoid robot kits that cost less than $1,000? I like
the ROBONOVA and KHR-1 body
designs with all of the motors and
flexibility, but it costs way too much
money for me. I was wondering if you
happened to know of any cheaper
robots out there.
— Andy Kerns
A. When it comes to fully articulated
humanoid robots, the ROBONOVA
(www.robonova.com) and the
Kondo KHR-2HV (www.kondo-robot.
com or visit www.trossenrobotics.
com) can be purchased for around
$1,000. The Kondo KHR-2HV is the next
generation of the KHR-1 and is a little
less expensive than the KHR-1.
Since humanoid robots are becom-
ing more popular,
there are new robot
designs coming out
each year. A couple
that I am aware of are
the I-Sobot (www.isobotrobot.com) which costs around $300 and
the RoboPhilo (www.robophilo.com) which costs about $500. I
don’t have any personal
experience with either
of these two robots,
but from what I can see
from the videos on their websites, they
are very impressive. The I-Sobot is
currently available from several places,
such as Amazon (www.amazon.com).
The RoboPhilo kit should be available by
December 2007. Table 1 shows a few
basic specifications for these two robots.
Another option to consider is the
BRAT from Lynxmotion (www.lynxmotion.com) which costs less than $300
for the basic kit. This is a very basic
bipedal robot kit that has a total of six
servos (three for each leg). It requires
assembly and a connection with a PC
to control the robot. If you add your
own electronics and develop your own
walking routines, the BRAT can become
autonomous.
For those people that want a
challenging project, the BRAT is an
inexpensive route to get started. All of
the parts on the BRAT are interchange-
able and expandable, so at a later time,
the BRAT can be reconfigured with
some additional parts to make a 17 or
19 degree of freedom robot.
On the subject of reconfigurable
robot kits, you might want to take a look at the Bioloid (www.tribotix.com) robotics kit. This is a very good
general-purpose robot kit which allows
you to build many different types of
robots, such as dogs, spiders, six-servo
walkers like the Lynxmotion BRAT, and
even the big 17+ servo humanoid robots.
The Bioloid robots use the Dynamixel
servos, which are some of the most
advanced robotics servos on the market.
To be able to build a humanoid
Tap into the sum of
all human knowledge
and get your questions answered here!
From software algorithms to material selection, Mr. Roboto strives to meet you
where you are — and what more would you expect from a complex service droid?
by
Pete Miles
Our resident expert on all things
robotic is merely an Email away.
roboto@servomagazine.com
Figure 1. I-Sobot.
Figure 2. RoboPhilo.
Table 1. I-Sobot and RoboPhilo Humanoid Robot Specifications.

Specification                  I-Sobot                        RoboPhilo
Height                         6.5 inches                     13 inches
Weight                         12 oz                          38 oz
Servos (degrees of freedom)    17                             20
Power                          3 AAA NiMH                     6V NiMH
Remote Control                 Infrared                       Infrared
Special Features               Built-in gyro, voice           Pre-programmed motions,
                               recognition, speaker,          programmable
                               pre-programmed motions,
                               programmable
Approximate Cost               $299                           ~$500
