Showing posts with label Robots. Show all posts

2/25/2014

Oscar-winning Flying-Cam system takes to the skies with QNX technology

Flying-Cam has been at the forefront of unmanned aerial filming since 1988.
Ever wonder how film crews manage to achieve death-defying camera angles that take your breath away? Well, wonder no more, because I am about to show you one of the most advanced tools of the trade. It's called SARAH, it runs on the QNX OS, and it recently won a Scientific and Technical Award from the Academy of Motion Picture Arts and Sciences for its contribution to movie making.

The SARAH unmanned aerial system is the brainchild of Flying-Cam, a company founded in 1988 by Emmanuel Prévinaire, who, in 1979, developed the first unmanned close-range aerial camera for motion pictures. SARAH represents the latest generation of Flying-Cam technology and has been in service since 2012 — yet its credits already include Skyfall, Oblivion, Prisoners, Smurfs II, and Mr. Go.

The Flying-Cam SARAH unmanned aerial system in action, filming a scene for Mr. Go. 

So why did the folks at Flying-Cam choose the QNX OS? Several factors contributed to the decision, including flexible architecture, predictable response times, and advanced profiling tools. To quote Tony Postiau, head of aerial robotics engineering at Flying-Cam, “We have been thoroughly impressed with the QNX OS. It works extremely well on our hardware and uses system resources efficiently, leaving most of the hardware processing power available to our application — a crucial attribute that we looked for.”

To find out more about QNX and the Flying-Cam SARAH system, check out the press release that QNX issued this morning.

And for a look at SARAH in action, here's a promotional video that demonstrates how it helps film crews capture angles that would be impossible for full-size helicopters, cable systems, or other traditional camera support devices:



4/20/2012

Bone researchers gear up with Lego Mindstorms

I'll admit it: I own a Meccano set. And despite my rapidly advancing years, I have no intention to give it up. Pretty sad, right?

Mind you, I've always thought it would be cool to build something useful with it — which is probably my way of rationalizing why I keep the damn thing.

That said, I've just stumbled on a video that shows how a building toy can, in fact, help create something useful — something that may ultimately aid humanity. The toy in this case isn't Meccano, but its 21st century equivalent: Lego Mindstorms. Check it out:



Maybe I'll hold on to my Meccano set just a little bit longer. Or maybe I should get with the program, pick up Lego Mindstorms, and start, well, programming. :-)
 

12/12/2011

Meet a true "Hiro" of robotic research

If developing next-gen robotics is your thing, Hiro's your man.

A couple of years ago, I introduced you to Hiro, a QNX-based robot designed for research and teaching programs in university labs. Even if you didn't read about Hiro here, he may still seem familiar, what with his appearances on Engadget, übergizmo, and other über-popular sites.

Kawada Industries, the company that created Hiro, describes him as a starter set for research into humanoid robots. To that end, Hiro comes equipped with a stereo vision camera, speech recognition, hands with 15 degrees of freedom, hand-mounted cameras, and a repeat positioning accuracy of less than 20 micrometers — that's 20 one-thousandths of a millimeter.

Since my last post, Kawada has uploaded some videos to demonstrate Hiro's chops. For instance, here's a clip showing how he has all the right moves:



And here's a clip showing how he can listen to voice commands:



If Hiro's role is to serve as a platform for next-gen robotics, he is succeeding. Recently, Osamu Hasegawa, a professor at the Tokyo Institute of Technology, used Hiro as the basis for a new "thinking" robot. The robot, also dubbed Hiro (confusing, I know), employs a Self-Organizing Incremental Neural Network algorithm to adapt to its environment and learn new tasks.

For instance, in this video, a researcher asks Hiro to pour a cup of water. Hiro has never done this before, but figures out how to do it. That's some algorithm!



For more information on Hiro and his manufacturer, Kawada Industries, click here.
 

6/21/2011

Taking QNX to the moon

Imagine driving a vehicle over rugged, unforgiving terrain filled with humongous rocks and craters.

I know, it sounds like a blast!

Now imagine if you had to control the vehicle via remote control.

Well, that could still be fun.

Now imagine if the signals from your remote control took 1.5 seconds to reach the vehicle, and if you had to wait another 1.5 seconds to see how the vehicle responds to your commands.

Hm, that could be a challenge.

In fact, overcoming this delay is just one of many challenges facing the 30 companies and development teams contending for the Google Lunar X PRIZE.
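For the curious, that 1.5-second figure is essentially the light-speed delay over the Earth-Moon distance. Here's a quick back-of-the-envelope check (a throwaway sketch of my own, not anything from the PTS codebase):

```python
# Back-of-the-envelope check on the quoted 1.5-second delay: one-way
# light travel time over the mean Earth-Moon distance. (Illustrative
# only; a real link adds processing and ground-segment overhead.)
EARTH_MOON_KM = 384_400          # mean Earth-Moon distance, km
LIGHT_SPEED_KM_S = 299_792.458   # speed of light, km/s

one_way = EARTH_MOON_KM / LIGHT_SPEED_KM_S
round_trip = 2 * one_way

print(f"one-way delay: {one_way:.2f} s")    # ~1.28 s
print(f"round trip:    {round_trip:.2f} s") # ~2.56 s
```

In other words, the raw speed of light accounts for most of the quoted delay; no amount of clever engineering can drive it below that floor.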

The mandate of the Lunar X PRIZE is simple: Send a rover to the Moon; drive it for at least 500 meters; and transmit video, images, and data back to the Earth. The devil, of course, is in the details.

First of all, landing on the moon and beaming back videos only gets you the base prize of $20 million. To earn the full $30 million, your lunar rover has to drive at least 5000 meters, survive two weeks of minus 182 degrees C, and take a photo of the Apollo landing site. (Presumably, the photo would put to rest rumors that NASA filmed the moon landing in Neil Armstrong’s basement, but don’t count on it.)

Mind you, all this assumes the rover arrives safely on the moon in the first place. For that to happen, the lander module carrying the rover must travel more than 400,000 kilometers at maximum speeds of more than 10 kilometers/sec. It must then decelerate by more than 2.5 kilometers/sec and fly a few hundred meters above the lunar surface until it finds a nice cushy landing spot.

Meet Asimov, the latest incarnation of the Part-Time Scientists' lunar rover.
So who the heck would take on such a challenge? The Part-Time Scientists, that’s who.

The Part-Time Scientists team is the first Google Lunar X PRIZE participant based primarily (though by no means completely) in Germany. They are also considered among the top 5 most likely teams to succeed.

I plan to post several articles on the team and their progress over the coming months, but in the meantime, consider this:

It isn’t 1969 anymore — The unmanned systems competing for the X PRIZE will need to pack a lot more software intelligence than the Apollo spacecraft. To handle the many real-time tasks on their lander and rover, the PTS team has chosen the QNX operating system.

These guys are a blast — Most members of the PTS team have real jobs. They work on this project in their spare time, out of sheer love and enthusiasm. In fact, when they do publicity around their project, children represent a large portion of their target audience. They clearly want the next generation of budding scientists to share the same passion for engineering and space exploration that they themselves have. Good, that.

Stay tuned for subsequent posts, where we'll dig deeper into the role that QNX plays in this exceedingly cool project.

And did I mention? You can follow the Part-Time Scientists on Twitter and on Facebook.
 

5/16/2011

QNX-powered MABEL robot makes front cover of IEEE Control Systems magazine

A couple of years ago, I regaled you with the story of MABEL, a robot designed to study bipedal locomotion. Among other things, MABEL aims to achieve a better compromise in "speed, stability, agility, and energy efficiency" than other bipedal-robot designs. (I'm quoting Jessie Grizzle, a professor of electrical engineering at the University of Michigan and one of the masterminds of the MABEL project.)

To acquire data from sensors, compute control actions, and output commands to actuators, MABEL uses a realtime computing and data acquisition environment based on the QNX Neutrino RTOS. The software framework for the control system is based on RHexLib, which was originally developed for RHex, another QNX-based robot.
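As a rough illustration of that sense-compute-actuate pattern, here is a generic sketch. To be clear, this is not MABEL's actual RHexLib code; the function names, gains, and control rate are all assumptions for illustration:

```python
# A generic sense -> compute -> actuate control step. NOT MABEL's code:
# the gains, rate, and names here are assumptions for illustration.

DT = 0.001  # assumed 1 kHz control period, seconds

def pd_control(error, error_rate, kp=50.0, kd=2.0):
    """Proportional-derivative law on a joint-angle error (assumed gains)."""
    return kp * error + kd * error_rate

def control_step(target, angle, prev_error):
    """One cycle: compare the sensed angle to the target, compute a torque."""
    error = target - angle
    torque = pd_control(error, (error - prev_error) / DT)
    return torque, error

# One cycle: joint at 0 rad, target 0.1 rad, error unchanged since last cycle
torque, err = control_step(0.1, 0.0, 0.1)
print(f"commanded torque: {torque:.2f}")  # prints: commanded torque: 5.00
```

In a real system a step like this runs at a fixed rate under the RTOS scheduler, with the acquired sensor data logged alongside for later analysis.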

If you're in the mood for a deep dive on MABEL, check out the April 2011 edition of IEEE Control Systems magazine — the article runs an impressive 25 pages. If, however, you'd like something a little lighter, I suggest the three following videos:

MABEL walking on an uneven surface


MABEL walking like a human, with "heel-strike and toe-off"


MABEL, before and after correction of a hardware bug

 

7/21/2010

QNX-based instrument tests haptic systems for bionic limbs

For decades, science fiction writers have speculated on what will happen to humanity once robots become sufficiently intelligent and sufficiently easy to mass-produce. The scenarios are endless: robots replacing humans, robots killing humans, robots entertaining humans, robots protecting humans, and last but not least, robots becoming human.

Meanwhile, back in the real world, many researchers and engineers are focusing on robotic technology that can assist humans. These include intelligent prostheses, such as bionic hands or arms, and "rehab robots" that help stroke patients re-learn to walk.

Designing a robotic system that can assist humans is one thing; testing to see whether it is accomplishing its goals is another.

Enter Kinea Design, a UK-based firm that specializes in human interactive mechatronics, including bionic hands. To test haptic "tactors" that let bionic devices provide a sense of touch, Kinea created the Greenbox, a test instrument based on the QNX Neutrino RTOS. This instrument can calibrate load cells, check closed-loop responses of actuators, interface with a variety of sensors, and perform a host of other tasks necessary for testing and verification.

Testing and refining these haptic tactors is critical: By providing a sense of touch, the tactors free amputees from having to rely solely on visual input when manipulating objects. Moreover, they enable amputees to sense vibration, surface texture, friction, and other useful environmental cues.

I'm only scratching the surface of Kinea's technology. For an in-depth article on their testing methods and philosophy, including the Greenbox, click here. For more information on their products, click here.

For other examples of how QNX technology enables robotics and robotics research, click here.
 

7/13/2009

Meet Mabel, the QNX-based bipedal robot

Hey, have you ever noticed how researchers like to give their robots personal names? I've written about three QNX-based humanoid robots in this blog, and none of them has a technical appellation like RU-138B. Rather, they're called Cog, Kismet, and Hiro.

I'm not sure why that is. Maybe it's because humans simply like to name things. Or maybe it's because giving a humanoid robot a personal name makes it less scary and more, well, human.

Whatever the reason, I've just come across another QNX robot, and guess what: it's called Mabel. Mabel isn't humanoid, but she is bipedal. More specifically, she is a “new platform for the study of bipedal locomotion in robots.”

A joint initiative between the U of Michigan and the Robotics Institute of Carnegie Mellon, Mabel has three stated purposes: to explore a new powertrain design with the objective of improving power efficiency; to encourage development of new feedback control algorithms for running on level surfaces and walking on rough terrain; and to help promote science and technology to younger students.

QNX in control

To acquire data from sensors, compute control actions, and output commands to actuators, Mabel uses a QNX-based realtime computing and data acquisition (DAQ) environment. The software framework for the control system implementation is based on RHexLib, which was originally developed for RHex, another QNX-based robot.

To learn more about Mabel, check out the technical paper, "MABEL: A New Robotic Bipedal Walker and Runner."

BTW, the root for Mabel is amabilis, a Latin word for lovable. Kind of supports the name-your-robot-to-make-it-less-creepy theory, doesn't it? :-)

5/14/2009

Meet Hiro, the QNX-controlled humanoid robot

Got US $77,000 burning a hole in your pocket? If so, you can now buy your very own QNX-controlled humanoid robot.

Or perhaps not. The robot, dubbed HIRO, is designed primarily for research and teaching programs in college and university labs. So chances are, you don’t want to bring this guy home. Unless, of course, programming robots to clean dishes and do the laundry is your thing.

Kidding aside, HIRO is a pretty serious piece of robotics, with a total of 15 degrees of freedom, stereo vision camera, two robotic hands, two hand-mounted cameras, and a repeat positioning accuracy of less than 20 micrometers. It also has a serious purpose, allowing researchers to study how robots work in real-world environments.

If you still want one, but don't have the cash, relax: You can buy the starter version for $57,000. Mind you, for that money, you could buy a 2009 BMW 320i. Decisions, decisions...

HIRO was co-developed by Kawada Industries and General Robotix (GRX). For more information, check out the articles published by Tech-On and Engadget.

10/20/2008

QNX drives seven tons of armor-plated attitude

I recently came across an article on the Crusher, a 7-ton, QNX-based autonomous vehicle that can haul a payload of 8000 pounds. Developed by the National Robotics Engineering Center (NREC) at Carnegie Mellon University, the Crusher is part of the U.S. military's Unmanned Ground Combat Vehicle Perceptor Integration (UPI) program. (Is it just me, or does that acronym have a few letters missing?)

The U.S. military has plans for the Crusher, such as performing reconnaissance in hostile areas and hauling supplies over rough terrain. They may also equip the Crusher with automatic weapons, which makes me wonder what Isaac Asimov would think of this beast.

I searched for the Crusher on YouTube and, sure enough, found a bunch of videos. I've included three: The first focuses on the sheer power and agility of the Crusher. The second shows how the Crusher will stop and "think" to determine the best way to cross a ditch, climb a hill, or negotiate an obstacle. And the third shows, among other things, how to control the Crusher with an XBox controller.

Here's the first video:



Here's the second; I recommend fast forwarding to the 1:00 mark:
[POSTSCRIPT: This video was removed from YouTube after this blog was posted.]

And here's the third:



To read more about the Crusher, click here.

6/18/2008

Weather too hot? Try this cool QNX app

I was snowblowing a friend’s driveway last winter when, suddenly, a plume of green stuff began to spew out of the snowblower's chute. I stopped, looked down, and was mortified to discover that the snowblower had just shredded my friend’s juniper bush.

The fact is, snowblowers aren’t precision instruments. Just ask the folks who operate the massive, 20-ton snowblowers that clear major roads and highways after winter storms. All too often, these vehicles bang into guardrails, resulting in costly, and often dangerous, guardrail repairs. (Imagine replacing a guardrail that prevents drivers from falling over a cliff and you’ll quickly realize where the “dangerous” part comes in.)

Enter the Advanced Rotary Plow (ARP). The brainchild of the California PATH program, the ARP automatically steers itself away from guardrails by following magnets embedded in the pavement. Controlled by the QNX RTOS, the ARP is one of the first real-world applications to come out of the PATH/CALTRANS research in automated vehicle control.

To create the ARP, the folks at PATH outfitted an existing snowblower with several add-ons, including:
  • a six-slot industrial computer

  • sensors for measuring steering angle and vehicle movements

  • sensors to measure the field strength of the road magnets

  • a steering wheel actuator
The industrial computer hosts several realtime processes, all running on the QNX RTOS. The processes obtain signals from the various sensors, control the steering actuator, and display information to the vehicle’s driver.
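To picture how magnet-following steering might work, here is a toy model. It is my own illustration, not PATH's algorithm; the sensor model, gains, and limits are invented:

```python
# Toy model of magnet-following steering. NOT the PATH algorithm:
# the sensor model, gains, and limits are invented for illustration.

def lateral_offset(left_field, right_field, half_track=0.1):
    """Estimate offset (meters) from the magnet line using the relative
    field strength at two magnetometers; positive = drifted right."""
    total = left_field + right_field
    return half_track * (left_field - right_field) / total

def steering_command(offset, gain=2.0, limit=0.5):
    """Proportional steering (radians), clamped to an assumed actuator range."""
    return max(-limit, min(limit, -gain * offset))

# Vehicle drifted right: the left sensor sees a stronger field, so steer left
offset = lateral_offset(2.0, 1.0)
print(f"offset {offset:+.3f} m, steer {steering_command(offset):+.3f} rad")
```

The real controller also has to fuse vehicle-motion sensors and handle missed magnets, but the core idea is the same: measure the drift, steer against it.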

For details on the ARP, including the positive outcomes of initial field trials, click here.

As for the juniper, it's healing nicely.

6/11/2008

This week's random hits

Check back every Friday for more random hits.

5/29/2008

This week's random hits

Check back every Friday for more random hits.

5/26/2008

Today's focus: The world's biggest robotic telescope

January 7, 1610 — Galileo Galilei peers through a small, homemade telescope and spies three moons orbiting the planet Jupiter. The discovery throws a wrench into the prevailing belief that everything in the universe orbits the Earth.

May 28, 2008 — Astronomers in Yunnan Province, China, peer through a 40-ton, $4.5 million robotic telescope that has the power to view galaxies more than 5 billion light years away. Their discoveries, no doubt, will throw a wrench into prevailing theories of how the universe works.

Some things, thankfully, never change.

Situated 3,240 meters above sea level, the QNX-controlled Yunnan telescope is the largest optical telescope in China. Designed for multiple applications, it helps astronomers search for planets, analyze supernovae, study the age of the universe, and investigate a variety of other stellar phenomena — this is one system where the sky really is the limit.

Telescope Technologies Limited (TTL), a firm based in Birkenhead, England, designed and built the Yunnan telescope. The firm's design requirements were nothing if not ambitious. The telescope had to:

  • support a variety of scientific instruments — up to 7 at a time
  • simplify maintenance through a modular design and through off-the-shelf software and hardware components
  • support multiple modes of operation, including remote control over the Web and fully autonomous robotic operation
  • satisfy the needs of a variety of observatories, astronomical programs, and user communities
  • simplify operation to minimize the number of human operators
To control the telescope’s many functions, TTL built a distributed system that comprises 7 embedded PCs running the QNX Neutrino RTOS. A separate PC controls each motion axis (azimuth, altitude, rotator) of the telescope; the remaining PCs handle mirror-cell pneumatics, data logging, security, and other auxiliary functions.
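The one-controller-per-axis split can be sketched in miniature. This is purely illustrative, not TTL's software; in the real system, each axis controller runs on its own networked PC rather than as an object in one process:

```python
# Miniature sketch of the one-controller-per-axis design. Purely
# illustrative, NOT TTL's software: in the real telescope each
# controller runs on its own QNX node, coordinated over the network.
class AxisController:
    def __init__(self, name, gain=0.5):
        self.name = name
        self.position = 0.0  # degrees
        self.target = 0.0
        self.gain = gain

    def step(self):
        # Close a fraction of the remaining error each cycle (a crude servo).
        self.position += self.gain * (self.target - self.position)

# One controller per motion axis, stepped by a simple coordinator loop
axes = [AxisController(n) for n in ("azimuth", "altitude", "rotator")]
axes[0].target = 180.0  # slew the azimuth axis halfway around
for _ in range(50):
    for ax in axes:
        ax.step()
print(round(axes[0].position, 3))  # → 180.0
```

The appeal of the distributed layout is isolation: a fault in the rotator's node can't stall the azimuth servo, and auxiliary functions like data logging live on separate machines entirely.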

Ease of use was critical to the system design, which provides a number of graphical displays for monitoring and control. Take, for example, the autoguider GUI (below), which allows the operator to manually control the autoguider camera for calibration, acquisition, and other purposes.


The autoguider GUI

For a closer look at this telescope’s many capabilities, check out TTL’s product specification.
 

5/12/2008

Using robots to address a serious kneed

No doubt about it, people in developed countries are getting older and older. Case in point: The number of knee replacements in the U.S. grew from 257,000 in 1998 to 455,000 in 2004. Since doctors rarely perform knee replacements on patients under 50, those numbers can reflect only one thing: an aging and progressively nonambulatory population.

Hip-replacement surgeries follow a similar trend. For evidence, consider the following chart, which aggregates knee and hip replacement surgeries performed in Ontario, Canada from 1992 to 2002. (If you're wondering, the dips reflect seasonal fluctuations. Not surprisingly, doctors perform fewer surgeries during their summer vacations and Christmas holidays.)


Source: BMC Health Services


These trends haven't escaped the notice of researchers at Waseda University, the Japanese equivalent of MIT. By creating QNX-based robots that walk like humans, the researchers hope to gain insights that will help older people walk longer. The bipedal robots act as human motion simulators, allowing researchers to generate quantitative data that can't easily (or safely) be measured with human subjects. Ultimately, the researchers believe this data will aid in the development of new healthcare technologies and therapies.

As someone whose knees have been wonky since 1987, I’m looking forward to their results. :-)

For technical details on the QNX-based WABIAN-2R robot used in this research, click here.

4/21/2008

Six systems for celebrating Earth day

Global warming? I’m still not convinced. Biofuels? Pure marketing hype. Organic foods? Healthy, especially for the companies who make money selling them.

Don't get me wrong. I care deeply about the environment and I believe that we should do more — a lot more — to make it better. But, sometimes, I think people would rather embrace fashionable solutions than do the right thing. Like consume less.

But, hey, enough of my skepticism. It’s Earth day, so let’s celebrate! To get things going, let me tell you about some QNX-based systems that are making the world a little bit greener:
  • Building automation system helps Boeing conserve electricity — This system, designed by Tridium, allows a large Boeing plant (over 1 million square feet) to slash power consumption by 20% during peak periods. More.


  • Automated sensor system prevents soil pollution — This system monitors the insulating foil in a waste landfill site, ensuring that dangerous substances don’t seep into the surrounding soil. More.


  • Traffic control system optimizes traffic flow — This system from Delcan minimizes traffic jams and shortens waits at intersections. Which means less gas is wasted going nowhere. More.


  • Autonomous underwater vehicles (AUVs) monitor underwater pollution — Pollution monitoring is just one of the talents of these Russian-designed AUVs. More.


  • High-speed simulation system speeds fuel-cell research — Fuel-cell hybrid vehicles (FCHV) hold the promise of cleaner air. This system speeds up the simulation of FCHVs based on proton exchange membranes. More.


  • Monitoring system helps factories reduce emissions — This system, designed by Bailey Controls, dates back to the early 1990s. You won’t find much about it on the Web, but if you have old copies of QNXnews lying around, look for Volume 6, Number 2 (1992) and turn to page 5.

How about you? Have you or your company designed a system that uses electricity or fuel efficiently? That minimizes pollution? That prevents environmental disasters? I’d love to hear about it!

4/17/2008

Well, king me: It's a checkers-playing robot!

QNX-based systems perform eye surgery, control air traffic, monitor nuclear power plants, and keep 9-1-1 systems running 24/7. Heck, they even control touchless car washes. But did you know that QNX can also play a wicked game of checkers? Check out the video here. (Hint: Skip the intro and fast-forward to the 00:40 mark.)

I don't know about you, but any robotics project that can prove its point using beer caps and bubble gum has got my vote!

For the record...

“If you really want to have a million people do something, don't ask them to speak Latin. It is enough to ask them to just speak English without using cuss words...”

- Wen-mei Hwu, a senior researcher in parallel programming at the University of Illinois, on resisting the temptation to create new programming languages for multi-core processors.

Random hits...

A QNX-based system for genome analysis; your forest on drugs; explaining cosmetic surgery to children.

4/03/2008

Balancing act

Want to see a cool example of real-time control? Check out this video of a QNX-based system controlling a Furuta pendulum:



This isn't just a parlor trick. Furuta pendulums help engineering students learn the principles of controlling dynamic or unstable systems, such as walking robots. In this case, the pendulum was part of a student project at the Norwegian University of Science and Technology.
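For a feel of what controlling an unstable system involves, here is a tiny simulation of a linearized inverted pendulum under state feedback. The model and constants are assumed for illustration, not taken from the NTNU students' project:

```python
# Tiny simulation of stabilizing a linearized inverted pendulum with
# state feedback. The model, constants, and gains are all assumptions
# for illustration; this is not the NTNU project's code.
G = 9.81              # gravity, m/s^2
L = 0.5               # assumed pendulum length, m
KP, KD = 40.0, 10.0   # feedback gains; KP must exceed G/L here for stability

def simulate(theta0=0.2, dt=0.001, steps=3000):
    """Euler-integrate theta'' = (G/L)*theta + u with u = -KP*theta - KD*omega."""
    theta, omega = theta0, 0.0  # angle from upright (rad), angular velocity
    for _ in range(steps):
        u = -(KP * theta + KD * omega)   # state-feedback control torque
        alpha = (G / L) * theta + u      # linearized upright dynamics
        omega += alpha * dt
        theta += omega * dt
    return theta

final = simulate()
print(f"angle after 3 s: {final:.5f} rad")  # decays toward 0
```

Without the feedback term the angle grows exponentially, which is exactly why the pendulum makes such a good teaching rig: the controller has to win a race against the plant's own dynamics, every cycle.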

Remotely related trivia: Josef Hofmann, a piano virtuoso and inventor who lived from 1876 to 1957, is credited with inventing windshield wipers for cars. Story has it, he got the idea by watching a metronome — which, of course, is a pendulum — swinging back and forth. (Metronomes, thankfully, went digital in the 1980s. The older mechanical types had a tendency to beat out syncopated rhythms when placed on anything less than a perfectly flat surface.)

2/04/2008

Getting a leg up on robotic design

I was tempted to buy a Roomba robovac last week, until my wife warned me of the pandemonium that would ensue when our schnauzers got a hold of it. So while I can’t entertain you with a cute video of my dogs attacking a robotic vacuum cleaner, how about the next best thing: a giant robotic leg powered by QNX.

This week, Carnegie Mellon’s student newspaper posted a story on Jonathan Hurst, a grad student who plans to develop a six-legged robot that can walk, run, and even climb stairs. Already, Hurst has designed an innovative leg that uses fiberglass springs to emulate natural running movements — a departure from the rotating gear motors and pneumatic actuators of traditional robots.

For his project, Hurst is using software from RHex, a QNX-based robot that is the “first documented autonomous legged machine to have exhibited general mobility... over general terrain.” RHex also serves as the basis for AQUA, a project dedicated to building amphibious robots that work underwater. Click here to see the RHex and AQUA robots in action.

If you search the web, you’ll find lots of other robotic projects based on QNX, including the famous Cog and Kismet robots from MIT. In most cases, these projects use QNX because of its realtime capabilities. However, the researchers at MIT needed to solve an additional problem: implementing efficient interprocess communications among Cog’s 32 processors. They found the solution in QNX transparent distributed processing (TDP), which allows an application to access remote software and hardware resources without special software coding. Recently, QNX published the source code for TDP on Foundry27.