Another year, another collection of creative and sometimes topical costumes. Happy Halloween, everybody!
Goodbye, ROSCon 2016
Following an exciting weekend, we bid farewell to ROSCon 2016 in Seoul. It was record-breaking in every way, with over 450 attendees and a 60% increase in sponsorship over last year. Thanks to everyone for coming and for your support! Stay tuned for details on the next event. We anticipate posting videos of the presentations by October 20.
Photo credit: Evan Ackerman
Bosch Research and Technology Center Joins Forces with Open Source Robotics Foundation to Advance the Development of ROS
Bosch Underwrites Full-Time Developer for ROS 2 Research
A full-time developer sponsored by Bosch’s Research and Technology Center in North America will begin working with the Open Source Robotics Foundation (OSRF) this month to advance the development of the Robot Operating System 2 (ROS 2), the two organizations announced today. This announcement demonstrates the mutual commitment of Bosch and OSRF not only to the development of ROS 2 but also to the worldwide community of ROS developers and supporters.
ROS 2 is the next generation of ROS, a set of libraries and tools that simplify the task of creating and programming robotic platforms and applications. Deanna Hood and William Woodall will present an update on ROS 2 at ROSCon, the upcoming developer conference, October 8-9 in Seoul, Korea.
“The development of ROS has been a collaborative effort from the beginning, and we are thrilled to continue that tradition with the support of Bosch,” said Brian Gerkey, CEO of Open Source Robotics Foundation. “We look forward to welcoming our newest ROS 2 team member.”
“We are honored to be a part of the movement that’s helping developers to create groundbreaking robotics technologies for industries all over the world,” said Axel Wendt, group manager for Robotics at the Research and Technology Center in Palo Alto. “This partnership delivers on our objective to cultivate and to drive innovation from the ground up.”
“The collaboration with OSRF is well aligned with our worldwide efforts in robotics research at Bosch,” said Kai Arras, head of robotics research at Robert Bosch GmbH. “We are glad to contribute back to the open source community in this way and look forward to new, exciting features of ROS 2 that are relevant to the industry.”
We’re collaborating with the Toyota Research Institute
You can see more formal announcements from us here and from the Toyota Research Institute here, but we are very pleased to announce our new relationship with TRI, the R&D arm of Toyota. TRI has contracted with us to help develop and grow ROS and Gazebo, and in addition has made a $1 million donation to OSRF.
The CEO of TRI, Gill Pratt, is very familiar with us and our work, including our contributions to the DARPA Robotics Challenge, for which he was the program manager.
As Gill says in our release:
I’ve witnessed first-hand the value of the Open Source Robotics Foundation. Of the twenty-three teams that competed in the DARPA Robotics Challenge, eighteen used ROS and fourteen used Gazebo. Through this charitable contribution, TRI will support efforts to grow the capabilities of ROS and Gazebo, not only for TRI, but also for the hundreds of thousands of members of the open source robotics community.
Part of today’s announcement also includes the news that we have created the Open Source Robotics Corporation, a for-profit subsidiary of OSRF. We will continue to create and distribute open source and free-of-charge applications for the robotics community, including ROS and Gazebo. If you have specific questions about our reorganization, please let us know.
HAPTIX: Simulation of prosthetic devices
Fundamentally, robotics is about helping people. Robots help us manufacture things, help us build things, and help make our lives easier and more convenient. As robotic systems increase in sophistication and capability, they’re starting to help people more directly, in elder care, rehabilitation centers, and hospitals. In the near future, robotics will become even more tightly integrated with humanity, to the point where cybernetics will be able to restore function to people with disabilities. One such program focuses in particular on military personnel who have lost limbs.
In 2014, DARPA announced its Hand Proprioception and Touch Interfaces (HAPTIX) program, which “seeks to create a prosthetic hand system that moves and provides sensation like a natural hand.”
According to Doug Weber, DARPA program manager of HAPTIX: “We believe that HAPTIX will create a sensory experience so rich and vibrant that the user will want to wear his or her prosthesis full-time and accept it as a natural extension of the body. If we can achieve that, DARPA is even closer to fulfilling its commitment to help restore full and natural functionality to wounded service members.”
Three different teams are involved in the HAPTIX project, and its success will depend on a carefully optimized mix of hardware, user interfaces, and control algorithms. OSRF is proud to be providing a customized version of the Gazebo simulator to the HAPTIX teams, allowing them to run tests on their software without being constrained by hardware availability: essentially, a kind of virtual playground for software engineers.
“The goal of HAPTIX is for OSRF to provide a realistic prosthetic simulation environment for biomechanical engineers to develop controllers for advanced prosthetics with high degrees of freedom,” explains John Hsu, co-founder and Chief Scientist at OSRF. The advanced prosthesis that DARPA is using in the HAPTIX program is DEKA’s “Luke” robotic arm, a 14 DoF cybernetic total arm replacement system. However, the arm is currently controlled by simple user interfaces designed for testing, and part of what HAPTIX hopes to deliver are interfaces that utilize control signals from muscles and nerves, while simultaneously delivering sensory feedback.
After nearly ten years of work and $40 million from DARPA, DEKA’s robotic arm is an amazing piece of hardware, but that’s just the beginning. “The hardware, in my opinion, needs to come before the software,” says Hsu. “They can be designed at the same time, but the hardware has a longer iteration cycle. Once you develop a nice hardware platform that’s stable, then you give it to the software team, and they take off, working really fast on the software while in the meantime trying not to break the hardware.”
This illustrates two reasons why having a good simulation environment is important: first, it lets you start working on the software before the hardware is fully complete, and second, it insulates software development, to some extent, from the hardware itself. That means you can have lots of engineers developing software in parallel, even if you only have one piece of hardware that may be fragile, expensive, and quite often inoperable for one reason or another.
For OSRF, creating and supporting a version of Gazebo for the HAPTIX program involves many different areas. Besides the customized simulation environment, OSRF has also provided teams with an OptiTrack motion capture device, NVIDIA stereo glasses and a 3D monitor, a 3D joystick, and the documentation required to get it all working together flawlessly. This custom version of Gazebo also includes support for a variety of teleoperation hardware, and for the first time, users can interact programmatically with the simulation from both Windows and MATLAB. HAPTIX developers can leverage these 3D sensors and teleoperation systems to translate the motions of physical arms and hands into virtual environments, allowing them to run common hand function tests in the real world and in simulation at the same time. This also lays the foundation for a framework that could provide amputees a powerful and affordable way to learn how to use their new prosthesis.
Once the HAPTIX teams receive their DEKA arms, OSRF’s job becomes even more important, according to Hsu, because they’ll get a chance to see how well the simulation is actually working and then refine it to bring it as close to reality as possible. “I’m really looking forward to the validation part,” Hsu says. “I think that’s one of the big missing pieces for many simulation platforms: good validation data. When we were working on Gazebo for the DARPA Robotics Challenge, we never had an ATLAS robot. Getting the DEKA hand to do validation is huge.”
Validation is the process of making sure that commands sent to the simulated DEKA arm result in the same movements as identical commands sent to the real DEKA arm. “We send commands to the real hand and the simulated hand to see if they behave differently,” explains Hsu. “If they do, we update our model to make them match.” The closer the simulation matches, the more useful it will be to the HAPTIX teams. The end goal is, of course, to get everything working on the real hardware, but an accurate and detailed simulator is critical to the development of effective software.
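The comparison loop Hsu describes can be sketched in a few lines. Everything below (the logged joint angles, the error metric, the acceptance threshold) is invented for illustration and is not part of the actual HAPTIX validation suite:

```python
import math

def rmse(real_traj, sim_traj):
    """Root-mean-square error between two equal-length joint trajectories."""
    assert len(real_traj) == len(sim_traj)
    return math.sqrt(
        sum((r - s) ** 2 for r, s in zip(real_traj, sim_traj)) / len(real_traj)
    )

# Hypothetical wrist-joint angles (radians) logged while sending the same
# command sequence to the real DEKA arm and to its Gazebo model.
real = [0.00, 0.10, 0.21, 0.33, 0.45]
sim = [0.00, 0.11, 0.20, 0.35, 0.44]

TOLERANCE = 0.05  # assumed acceptance threshold, purely illustrative
error = rmse(real, sim)
print("model OK" if error < TOLERANCE else "update model")
```

When the error exceeds the tolerance, the model parameters (friction, joint damping, and so on) get adjusted and the test is rerun, which is the "update our model to make them match" step Hsu mentions.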
The first generation of the DEKA arm recently arrived at OSRF for validation testing, and the complete hardware is expected before the end of the year. OSRF has been steadily releasing a series of stable versions of the HAPTIX simulator, and as the fidelity of simulated position holding, force control and response, and other dynamics are verified on the arm over the next few months, OSRF will continue upgrading the simulation software to make sure that the HAPTIX teams have all of the tools that they need to progress as quickly and efficiently as possible.
By early 2017, Phase 1 of HAPTIX will be complete, and the software and hardware components that prove to be the most successful will continue into Phase 2, the end goal of which is a complete, functional HAPTIX system. DARPA is hoping that take-home trials of such a system will happen by 2019, and that soon after, any amputee who needs one will be able to benefit from a prosthetic hand that acts (and feels) just like the real thing.
ROS at the Intel Developer Forum
Next week is the Intel Developer Forum in San Francisco.
If you know anything about ROS and robots, then by now you know about the integration between ROS and Intel’s RealSense Camera.
Given this relationship, you can expect to see and hear a lot about ROS next week at IDF.
We encourage you to check out these sessions next week in San Francisco:
- Wednesday 11:00 AM – 12:00 PM : Intel Robotics Overview (Level 2 Room 2004)
- Wednesday 1:15 PM – 2:15 PM : Intel RealSense Technology: Adding Human-like Sensing to Devices (Level 2 Room 2016)
- Wednesday 4:00 PM – 5:00 PM : Getting Started with the Intel RealSense Robotic Development Kit (Level 2 Room 2004)
- Thursday 9:30 AM – 11:45 AM : Introduction to Autonomous Robots (Level 2 Lab Room 2011)
Michael Ferguson (Fetch Robotics): Accelerating Your Robotics Startup with ROS
Michael Ferguson spent a year as a software engineer at Willow Garage, helping rewrite the ROS calibration system, among other projects. In 2013, he co-founded Unbounded Robotics, and is currently the CTO of Fetch Robotics. At Fetch, Michael is one of the primary people responsible for making sure that Fetch’s robots reliably fetch things. Mike’s ROSCon talk is about how to effectively use ROS as an integral part of your robotics business, including best practices, potential issues to avoid, and how you should handle open source and intellectual property.
Because of how ROS works, much of your software development (commercial or otherwise) is dependent on many external packages. These packages are constantly being changed for the better, and sometimes for the worse, at unpredictable intervals that are completely out of your control. Using continuous integration, consisting of systems that can handle automated builds, testing, and deployment, can help you catch new problems as early as possible. Michael also shares a useful way to avoid new problems: don't switch to new software as soon as it becomes available. Instead, stick with long-term support releases, such as Ubuntu 14.04 and ROS Indigo.
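One way a CI job can enforce that "stick with what you've tested" advice is a simple version-pin check that fails the build when a dependency drifts. The package names and version numbers below are made up for illustration; this is not Fetch's actual setup:

```python
# Toy sketch of dependency pinning: compare the package versions you are
# actually running against a manifest of versions you have tested.
PINNED = {
    "ros-indigo-navigation": "1.12.4",  # hypothetical tested version
    "ros-indigo-moveit": "0.7.1",       # hypothetical tested version
}

def check_pins(installed):
    """Return the names of packages whose installed version drifted from the pin."""
    return [
        name for name, version in PINNED.items()
        if installed.get(name) != version
    ]

# In a real CI job, 'installed' would be queried from the package manager.
installed = {"ros-indigo-navigation": "1.12.4", "ros-indigo-moveit": "0.7.2"}
drifted = check_pins(installed)
if drifted:
    print("version drift detected:", drifted)
```

A check like this turns an unpredictable upstream change into a visible, early build failure instead of a mystery bug on the robot.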
While the foundation of ROS is built on open source, using ROS doesn’t mean that all of the software magic that you create for your robotics company has to be given away for free. ROS supports many different kinds of licenses, some of which your lawyers will be more happy with than others, but there are enough options with enough flexibility that it doesn’t have to be an issue. Using Fetch Robotics as an example, Mike discusses what components of ROS his company uses in their commercial products, including ROS Navigation and MoveIt. With these established packages as a base, Fetch was able to quickly put together operational demos, and then iterate on an operating platform by developing custom plugins optimized for their specific use cases.
When considering how to use ROS as part of your company, it’s important to look closely at the packages you decide to incorporate, to make sure that they have a friendly license, good documentation, recent updates, built-in tests, and a standardized interface. Keeping track of all of this will make your startup life easier in the long run. As long as you’re careful, relying on ROS can make your company more agile, more productive, and ready to make a whole bunch of money off of the future of robotics.
~~~~~~~~~~~~~~~~~~~~
Next up: Ryan Gariepy (Clearpath Robotics)
ROSCon 2016: Proposal deadline July 8th and venue information
With just over 3 months to go before ROSCon 2016, we have some important announcements:
* The deadline for submitting presentation proposals is July 8, 2016. If you want to present your work at ROSCon this year, make sure to submit your proposal before the deadline: http://roscon.ros.org/2016/#call-for-proposals.
* The conference will be held at the Conrad Seoul. Hotel rooms at the discounted conference rate are limited! Reserve your room today. http://roscon.ros.org/2016/#location. Also listed are some options for child care during the conference, which we hope will be helpful for attendees traveling with families.
* Registration will open in a couple of weeks: http://roscon.ros.org/2016/#important-dates.
We can’t put on ROSCon without the support of our generous sponsors, who now include Clearpath Robotics, Southwest Research Institute, GaiTech, and ARM!
http://roscon.ros.org/2016/#sponsors
We’d like to especially thank our Platinum and Gold Sponsors: Fetch Robotics, Clearpath Robotics, Intel, Southwest Research Institute, and Yujin Robot.
Moritz Tenorth (Magazino): Maru and Toru — Item-Specific Logistics Solutions Based on ROS
It’s not sexy, but the next big thing for robots is starting to look like warehouse logistics. The potential market is huge, and a number of startups are developing mobile platforms to automate dull and tedious order fulfillment tasks. Transporting products is just one problem worth solving: picking those products off of shelves is another. Magazino is a German startup that’s developing a robot called Toru that can grasp individual objects off of warehouse shelves, a particularly tricky task that Magazino is tackling with ROS.
Moritz Tenorth is Head of Software Development at Magazino. In his ROSCon talk, Moritz describes Magazino’s Toru as “a mobile pick and place robot that works together with humans in a shared environment,” which is exactly what you’d want in an e-commerce warehouse. The reason that picking is a hard problem, as Moritz explains, is perception coupled with dynamic environments and high uncertainty: if you want a robot that can pick a wide range of objects, it needs to be able to flexibly understand and react to its environment, something that robots are notoriously bad at. ROS is particularly well suited to this, since it’s easy to intelligently integrate as much sensing as you need into your platform.
Magazino’s experience building and deploying their robots has given them a unique perspective on warehouse commercialization with ROS. For example, databases and persistent storage are crucial (as opposed to a focus on runtime), and real-time control turns out to be less important than being able to quickly and easily develop planning algorithms and reducing system complexity. Software components in the ROS ecosystem can vary wildly in quality and upkeep, although ROS-Industrial is working hard to develop code quality metrics. Magazino is also working on remote support and analysis tools, and trying to determine how much communication is required in a multi-robot system, which native ROS isn’t very good at.
Even with those (few) constructive criticisms in mind, Magazino says that ROS is a fantastic way to quickly iterate on both software and hardware in parallel, especially when combined with 3D printed prototypes for testing. Most importantly, Magazino feels comfortable with ROS: it has a familiar workflow, versatile build system, flexible development architecture, robust community that makes hiring a cinch, and it’s still (somehow) easy to use.
Next up: Michael Ferguson (Fetch Robotics)
Tom Moore: Working with the Robot Localization Package
Clearpath Robotics is best known for building yellow and black robots: the research platforms you’d build for yourself, if it weren’t much easier to just get them from Clearpath. All of their robots run ROS, and Clearpath has been heavily involved in the ROS community for years. Tom Moore, now with Locus Robotics, spent seven months as an autonomy developer at Clearpath. He is the author and maintainer of the robot_localization ROS package, and gave a presentation about it at ROSCon 2015.
robot_localization is a general purpose state estimation package that’s used to give you (and your robot) an accurate sense of where it is and what it’s doing, based on input from as many sensors as you want. The more sensors you’re able to use for a state estimate, the better that estimate is going to be, especially if you’re dealing with real-worldish things like unreliable GPS or hardware that flakes out on you from time to time. robot_localization has been specifically designed to handle cases like these, in an easy to use and highly customizable way. It provides state estimation in 3D space, per-sensor message control, support for an unlimited number of sensors (just in case you have 42 IMUs and nothing better to do), and more.
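The intuition behind fusing many sensors can be illustrated with a scalar Kalman update. This toy is not robot_localization’s actual filter code (the package runs a full EKF/UKF over a 15-dimensional state), just the underlying principle: each new measurement pulls the estimate toward itself in proportion to how trustworthy it is, and every fusion step shrinks the overall uncertainty. The sensor readings and variances below are invented:

```python
def fuse(est, est_var, meas, meas_var):
    """One scalar Kalman update: blend the current estimate with a new
    measurement, weighting each by its confidence (inverse variance)."""
    k = est_var / (est_var + meas_var)  # Kalman gain: how much to trust the measurement
    new_est = est + k * (meas - est)
    new_var = (1.0 - k) * est_var       # uncertainty always shrinks after fusion
    return new_est, new_var

# Start with a very uncertain position estimate, then fold in a noisy GPS
# fix followed by a tighter odometry-based reading (values are made up).
x, var = 0.0, 1000.0
x, var = fuse(x, var, 10.2, 4.0)  # GPS: 10.2 m, variance 4
x, var = fuse(x, var, 10.0, 1.0)  # odometry: 10.0 m, variance 1
print(round(x, 2), round(var, 2))
```

Because the second reading has the lower variance, the fused estimate lands closer to 10.0 than to 10.2, which is exactly the "more sensors, better estimate" behavior described above.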
Tom’s ROSCon talk takes us through some typical use cases for robot_localization, describes where the package fits in with the ROS navigation stack, explains how to prepare your sensor data, and shows how to configure estimation nodes for localization. The talk ends with a live(ish) demo, followed by a quick tutorial on how to convert data from your GPS into your robot’s world frame.
The robot_localization package is up to date and very well documented, and you can learn more about it on the ROS Wiki.
Next up: Moritz Tenorth, Ulrich Klank, & Nikolas Engelhard (Magazino GmbH)