
ROS Meets Precision Agriculture at Blue River Technology

Blue River lettuce thinner

If you were to design the worst possible environment for software engineering, the cramped jump seat of a John Deere tractor would be a contender. The sound and vibration of the engine make conversation and concentration difficult. If the sun isn't making it impossible to see the monitor, the blowing dust is.

This is a common scenario at Blue River Technology because the company is in the agriculture business. Blue River combines computer vision and robotics to deliver precision thinning to lettuce growers.

Blue River lettuce thinner

Blue River has been using ROS since late 2012. According to Willy Pell, Blue River's Sr. Systems Engineer: "We love ROS because it makes it easy to find and correct errors in the worst possible circumstances. Any time something is wrong we know within a few dozen lines of code where the problem is presenting itself. It allows us to build systems The UNIX Way. In other words, we make simple, open source programs that communicate well with other programs." 

Blue River makes machines called lettuce thinners. Lettuce growers plant too many seeds because only 80% of seeds actually turn into plants. Since a lettuce head needs 10 inches on either side to get the resources it needs, growers must then thin the field of excess lettuce. Blue River's machine is pulled behind a tractor and takes pictures of the plant seedlings. It identifies the ones to keep and the ones to kill, and toggles a sprayer to render its verdict. There is finality to this machine: if it messes up, it doesn’t just waste time, it impacts the grower’s yield.
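
The post doesn't describe Blue River's software internals, but the pipeline it outlines (camera images in, a keep-or-kill decision, a sprayer command out) maps naturally onto a small ROS node. The sketch below is purely illustrative: the topic names, message types, and the classify_seedling() helper are assumptions made for this example, not Blue River's actual interfaces.

    #!/usr/bin/env python
    # Illustrative sketch only: topic names, message types, and the classifier
    # below are assumptions for this example, not Blue River's actual interfaces.
    import rospy
    from cv_bridge import CvBridge
    from sensor_msgs.msg import Image
    from std_msgs.msg import Bool


    def classify_seedling(cv_image):
        """Stand-in for the real keep/kill decision (e.g. a trained model)."""
        return "kill"  # pretend every plant in view is excess


    class ThinnerNode(object):
        def __init__(self):
            self.bridge = CvBridge()
            self.spray_pub = rospy.Publisher("spray_cmd", Bool, queue_size=10)
            rospy.Subscriber("camera/image_raw", Image, self.on_image)

        def on_image(self, msg):
            cv_image = self.bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
            verdict = classify_seedling(cv_image)
            # Toggle the sprayer: True means spray (kill), False means leave the plant.
            self.spray_pub.publish(Bool(data=(verdict == "kill")))


    if __name__ == "__main__":
        rospy.init_node("lettuce_thinner")
        ThinnerNode()
        rospy.spin()

Because a node like this only communicates over topics, the same logic can be exercised against recorded bag data or a simulator, which is part of what makes debugging in a dusty tractor cab tractable.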

Added Pell, "ROS has been a fantastic tool for us. I love how you can gut one node and not have it affect the rest of the system. I love how you can break the system apart and test subcomponents. Being able to confidently refactor, test and debug large parts of the system allows us to evolve extremely quickly."
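
Breaking the system apart and testing subcomponents, as Pell describes, works best when the decision logic lives outside the ROS callbacks so it can be exercised without a running ROS graph. A minimal sketch of that idea, reusing the hypothetical classify_seedling() from the example above (the thinner_node module name is likewise made up):

    # Minimal, hypothetical unit test: the decision function is exercised
    # directly, with no ROS master, camera, or sprayer in the loop.
    import unittest

    import numpy as np

    from thinner_node import classify_seedling  # hypothetical module from the sketch above


    class TestClassifier(unittest.TestCase):
        def test_returns_a_verdict(self):
            fake_image = np.zeros((480, 640, 3), dtype=np.uint8)
            self.assertIn(classify_seedling(fake_image), ("keep", "kill"))


    if __name__ == "__main__":
        unittest.main()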

Blue River lettuce thinner

It never ceases to amaze and delight us when we learn of new and innovative uses of ROS. Just recently, ROS celebrated a celestial milestone when it arrived at the International Space Station as part of Robonaut 2. While Blue River's solution is certainly more terrestrial, it is no less innovative and impactful. Being able to deliver a precision agriculture solution to farmers means higher yields and fewer chemicals.

Because of the permissive open source license of ROS, we aren't always aware of who is using ROS and for what purposes. In this case, we are very grateful to the team at Blue River for sharing their story with us.

If you are using ROS and have a story to share, please drop us a line at info@osrfoundation.org.

New robot hand, with OSRF electronics and ROS support

The RightHand Robotics ReFlex hand.

This week at ICRA in Hong Kong, RightHand Robotics is announcing their new ReFlex hand. Built on over a decade of research in the Harvard Biorobotics Lab and the Yale GRAB Lab, it leverages the best insights the team gained from the DARPA Autonomous Robotic Manipulation (ARM) program. The hand provides three mechanically intelligent underactuated fingers, highly sensitive tactile feedback, a solid electrical interface designed by OSRF, and (naturally) a ROS API!

If you're at ICRA, find the RightHand team to see a live demo. Otherwise, here's a video:

Meet your (ROS-based) cleaning team

We learned recently from the folks at Avidbots that they're developing ROS-based commercial cleaning robots. Here's their story:

Billions of square feet of commercial floor space are cleaned nightly in the US. Avidbots automates the most time-intensive tasks of retail and warehouse cleaning: sweeping and scrubbing floors. Powered by ROS, these robots automatically clean floors in grocery stores, airports, and malls, enabling cleaning staff to concentrate on higher-value tasks such as window cleaning, dusting, and polishing. The end result? Staff who are better-paid and more productive -- a clean win for everyone.

While developing these robots, Avidbots must iterate rapidly through designs. Two key facilitators of this fast development cycle are ROS and Gazebo: ROS’s communication system promotes a simple, modular design, while Gazebo provides accurate simulation-based testing. Together, modular software and thorough testing in simulation let Avidbots meet its rapid development goals, and this strategic use of ROS and Gazebo is significantly accelerating Avidbots’ entry into the robotic services space.
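
One concrete way that modularity pays off: a node that only reads and writes topics doesn't care whether those topics come from Gazebo or from the physical robot, so the same code can be exercised in simulation before it ever touches hardware. The sketch below is a generic illustration of that pattern, not Avidbots' code; the scan and cmd_vel topic names are simply common ROS defaults.

    #!/usr/bin/env python
    # Generic illustration, not Avidbots' code: because this node only reads the
    # scan topic and writes cmd_vel, the same script runs unchanged against a
    # Gazebo-simulated robot or the physical machine.
    import rospy
    from geometry_msgs.msg import Twist
    from sensor_msgs.msg import LaserScan


    class StopOnObstacle(object):
        def __init__(self):
            self.cmd_pub = rospy.Publisher("cmd_vel", Twist, queue_size=1)
            rospy.Subscriber("scan", LaserScan, self.on_scan)

        def on_scan(self, scan):
            cmd = Twist()
            # Creep forward unless something is closer than half a meter.
            if min(scan.ranges) > 0.5:
                cmd.linear.x = 0.2
            self.cmd_pub.publish(cmd)


    if __name__ == "__main__":
        rospy.init_node("stop_on_obstacle")
        StopOnObstacle()
        rospy.spin()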

Avidbots prototypes

Brian Gerkey at Solid 2014

Our own Brian Gerkey is speaking today at Solid 2014 in San Francisco. In his talk, Brian will introduce the DARPA Robotics Challenge (DRC), present in detail the cloud-hosted Virtual Robotics Challenge component of the DRC, and discuss opportunities for the resulting open source simulation software to be an ongoing platform for robotics research, education, design, and product development.

ROS in the cockpit

There's a new DARPA-funded effort to develop software and hardware to assist pilots in various kinds of aircraft. The program is called ALIAS, for Aircrew Labor In-Cockpit Automation System (apparently unrelated to that other Alias).

What does this have to do with us? Well, we learned last week that, in advance of the full ALIAS program, DARPA funded a pilot project (so to speak), in which Moiré Incorporated demonstrated the kind of thing they're imagining by retrofitting a King Air cockpit simulator with a stereo camera, text-to-speech, speech-to-text, and a Kinova MICO robot arm. They showed how those sorts of tools could help by having the robot pitch in during a simulation of a very bad day for the plane, complete with engine fires and cabin depressurization.

And, naturally, Moiré built the whole system with ROS.

There's a cool video demonstrating this system; it will be available after the ALIAS proposal period closes on July 14, 2014.

Gzweb for Mobile Platforms

Gzweb Mobile from OSRF on Vimeo.

During her GNOME Outreach Program for Women internship with OSRF, Louise Poubel made Gzweb work on mobile platforms by designing a mobile-friendly interface and implementing lighter graphics. Until recently, Gazebo was accessible only on the desktop; Gzweb, Gazebo's web client, allows simulations to be visualized in a web browser.

Louise implemented the graphics using WebGL. The interface includes menus suited to mobile devices and multi-touch interactions for navigating the 3D scene. She conducted usability tests throughout development to improve the user experience and to discover and resolve bugs quickly.

To optimize 3D rendering performance on mobile platforms, she also implemented a mesh simplification tool. It lets users choose how much to simplify the 3D models in the database at deployment time, generating coarse versions of the meshes for Gzweb to use.
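
The post doesn't say which simplification algorithm the tool uses. One common approach to this kind of coarsening is vertex clustering, where vertices are snapped to a coarse grid and merged; the cell size controls how aggressive the simplification is. A rough Python/NumPy illustration of that idea (not the actual Gzweb tool):

    # Rough illustration of vertex-clustering simplification, not the actual
    # Gzweb tool: vertices that fall into the same grid cell are merged, and
    # triangles that collapse as a result are dropped.
    import numpy as np


    def simplify(vertices, triangles, cell_size):
        """vertices: (N, 3) float array; triangles: (M, 3) array of vertex indices."""
        cells = np.floor(vertices / cell_size).astype(np.int64)
        # One representative vertex per occupied grid cell.
        _, representative, remap = np.unique(
            cells, axis=0, return_index=True, return_inverse=True)
        remap = remap.ravel()
        new_vertices = vertices[representative]
        new_triangles = remap[triangles]
        # Keep only triangles whose three corners are still distinct.
        keep = ((new_triangles[:, 0] != new_triangles[:, 1]) &
                (new_triangles[:, 1] != new_triangles[:, 2]) &
                (new_triangles[:, 0] != new_triangles[:, 2]))
        return new_vertices, new_triangles[keep]


    if __name__ == "__main__":
        verts = np.random.rand(1000, 3)
        tris = np.random.randint(0, 1000, size=(2000, 3))
        v, t = simplify(verts, tris, cell_size=0.2)
        print("%d -> %d vertices, %d -> %d triangles"
              % (len(verts), len(v), len(tris), len(t)))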

Mobile devices have been, and will continue to be, a big part of our lives. With Gzweb Mobile, users can visualize simulations on mobile phones and tablets and interact with the scene, inserting shapes and moving models around.

References:
http://www.gazebosim.org
Gzweb wiki

Repositories:
Gzweb Bitbucket repository

OSRF at FIRST competition in St. Louis

The FIRST logo

Our own Nate Koenig is at the FIRST regional competition today and tomorrow in St. Louis. The kids are enjoying trying out the Gazebo simulation of the FIRST competition arena, which includes a forklift-equipped robot and some objects to interact with:

Gazebo simulation of FIRST competition arena
Students trying Gazebo simulation of FIRST competition arena

This simulation environment is being beta-tested with a few teams now. We hope that Gazebo will be available to all FIRST teams next year.

And of course our immersive virtual reality demo that combines Gazebo with the Oculus Rift headset and the Razer Hydra controller remains a hit:

Students trying Gazebo immersive virtual reality demo

CloudSim-Ed: A Robotics MOOC Prototype

The OPW logo

CloudSim-Ed from OSRF on Vimeo.

During her GNOME Outreach Program for Women internship with OSRF, Ana Marian Pedro worked on CloudSim-Ed, a prototype for a massive open online robotics course built with Google CourseBuilder. The course offers simulation tasks and challenges created with CloudSim, Gazebo and ROS.

To enroll in a course, a student must have a Google account and basic CloudSim credentials. CloudSim simulators are controlled from a custom module in CourseBuilder to launch simulation challenges and retrieve the score. When a challenge is launched, the Gazebo simulation world is viewed through a web interface, while an IPython notebook provides the means to interact with the simulated robot using ROS commands.
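
The notebook side of this usually comes down to a few lines of rospy. Here is a minimal sketch of the kind of cell a student might run, assuming a differential-drive robot that listens on a cmd_vel topic (the topic name is an assumption, not necessarily what the course uses):

    # Hypothetical notebook cell: drive a simulated robot forward for two seconds.
    # The cmd_vel topic name is an assumption about the course's robots.
    import rospy
    from geometry_msgs.msg import Twist

    rospy.init_node("notebook_teleop", anonymous=True)
    pub = rospy.Publisher("cmd_vel", Twist, queue_size=1)

    cmd = Twist()
    cmd.linear.x = 0.3                      # m/s, straight ahead

    rate = rospy.Rate(10)                   # publish at 10 Hz
    end = rospy.Time.now() + rospy.Duration(2.0)
    while rospy.Time.now() < end and not rospy.is_shutdown():
        pub.publish(cmd)
        rate.sleep()

    pub.publish(Twist())                    # zero velocity: stop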

This project aims to give students a means to learn robotics using open source software. For schools with limited robotics laboratory space and equipment, the simulated worlds and environments give students a chance to experiment with minimal setup time and effort. This should also reduce the time spent troubleshooting software once students move on to actual robot hardware.

References:
http://www.ros.org
http://www.gazebosim.org
http://cloudsim.io
https://code.google.com/p/course-builder/

Repositories:
https://bitbucket.org/ammpedro/cloudsim-ed-actuation
https://bitbucket.org/ammpedro/cloudsim-ed-web
https://sunlit-vortex-449.appspot.com/course

Happy National Robotics Week!

All of us at OSRF would like to wish everyone a happy National Robotics Week! There are events celebrating all things robotic all week long throughout the U.S. A full list of events can be found here. The folks at RoboWeek 2014 have even created some cool robot trading cards. Come and get 'em here. (At least two of them are running ROS.)

Closer to home, OSRF will be taking part in the Silicon Valley Robot Block Party. We will be among the many cool robotics companies showing off their wares. The event is this Wednesday, April 9, from 1:00 to 4:00 pm at WilmerHale in Palo Alto. Robot Block Party is a free event open to the public, so go ahead and spread the word.

And for those of you who can't make it Wednesday, or simply can't get enough of OSRF and ROS, please look for us on Thursday at Xconomy's Robo Madness 2014. Hosted by Xconomy's Wade Roush, Robo Madness takes place at SRI International from 1:00 to 5:40 pm so get yourself signed up right away. Our own Brian Gerkey will be on stage at 1:25 discussing ROS, and then again at 5:20 on a wrap-up panel discussion moderated by John Markoff of The New York Times.

HERE mapping cars run ROS

As reported at HERE Three Sixty, their global fleet of hundreds of mapping cars is running ROS!

HERE car

They carry laser range-finders, cameras, and GPS that are used to estimate the vehicle's position and gather 3-D pictures of the surrounding environment. That data gets shipped back to their headquarters for processing.

As HERE's Michael Prados put it, "The system of sensors and computers means the software that's needed is very like that which is used to create robots." So they decided to build their cars' software on ROS. The software runs on a headless server in the car's interior, with the driver interacting via a mobile application on a tablet that he or she can operate easily from the seat.
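
The article doesn't go into HERE's software in any detail, but the capture half of the workflow it describes (read the sensors on the car, store the data for later processing at headquarters) can be illustrated with a short rospy node. The topic names and bag location below are assumptions, not HERE's actual configuration.

    #!/usr/bin/env python
    # Illustrative only: topic names and the bag location are assumptions, not
    # HERE's actual system. Record GPS fixes and lidar scans to a bag file that
    # can later be shipped off the vehicle for processing.
    import threading

    import rosbag
    import rospy
    from sensor_msgs.msg import NavSatFix, PointCloud2

    bag = rosbag.Bag("capture.bag", "w")
    lock = threading.Lock()  # rospy delivers each subscription's messages on its own thread


    def record(topic):
        def callback(msg):
            with lock:
                bag.write(topic, msg)
        return callback


    if __name__ == "__main__":
        rospy.init_node("capture_logger")
        rospy.Subscriber("gps/fix", NavSatFix, record("gps/fix"))
        rospy.Subscriber("velodyne_points", PointCloud2, record("velodyne_points"))
        rospy.on_shutdown(bag.close)
        rospy.spin()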

HERE car interior

"We chose the open source ROS because it was the best solution, hands-down," Michael concludes. "And now we're looking into the ways that we might give back to OSRF, and help its future success."

Read the whole story at HERE Three Sixty.
