Tech Workshop on MoveIt, security & skill-oriented programming with ROS

The Fall edition of ROS-Industrial EU Tech Workshop took place at Fraunhofer IPA on October 09th and 10th, 2019.

We were glad to host two European MoveIt maintainers, Henning Kayser of ROS-Industrial Consortium member PickNik Robotics and Michael Görner from the University of Hamburg. They gave us an insight into the latest developments in MoveIt (incorporating motion planning, manipulation, 3D perception, kinematics, control & navigation), current and planned developments for ROS2 (MoveIt2), and a hands-on, ROS(1)-based 'bare-metal to product' session. First, they presented an inside view of the manipulation framework. Providing complementary academic and industrial perspectives, they shared their views and experiences on MoveIt's overall structure, practical deployment of planning-based pipelines, complex manipulation planning using the MoveIt Task Constructor, and upcoming projects and ideas for a ROS2 migration. The workshop concluded with a practical session that guided the participants through setting up a functional Pick&Place pipeline from a custom bare robot description. Slides and code examples are available at https://github.com/henningkayser/ROS-Industrial_EU_Fall19_MoveIt .
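For readers who want a feel for what such a pipeline looks like in code, below is a minimal sketch of commanding a MoveIt planning group from Python on ROS1. It is not the workshop material itself (the repository linked above contains the actual exercises); the planning group name and target pose are assumptions.

```python
#!/usr/bin/env python
# Minimal MoveIt motion command via the moveit_commander Python interface (ROS1).
# The planning group name "manipulator" and the target pose are assumptions.
import sys
import rospy
import moveit_commander
from geometry_msgs.msg import Pose

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("pick_place_sketch")

group = moveit_commander.MoveGroupCommander("manipulator")  # assumed group name

# A rough pre-grasp pose expressed in the group's planning frame.
target = Pose()
target.position.x = 0.4
target.position.y = 0.0
target.position.z = 0.3
target.orientation.w = 1.0

group.set_pose_target(target)
success = group.go(wait=True)   # plan and execute in one call
group.stop()
group.clear_pose_targets()

rospy.loginfo("Motion %s", "succeeded" if success else "failed")
moveit_commander.roscpp_shutdown()
```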

The first session on day 2 of the ROS-Industrial EU Fall'19 Workshop was about security in ROS. Sebastian Taurer from JOANNEUM RESEARCH presented his work on a penetration testing tool for ROS1, called 'ROSPenTo', and gave an introduction on how to use SROS2 to secure communications in ROS2. In the first part of the session, ROSPenTo was introduced with basic information on how it works and what a user can do with it. During the hands-on section, the participants were guided through a step-by-step manual showing how to analyse, penetrate and modify a running ROS1 system using ROSPenTo. In the second part of the session, ROS2's security tools (a.k.a. SROS2) were explained and used to set up and configure a security infrastructure. The provided examples demonstrated the creation of all necessary security artefacts (e.g. keys, certificates, etc.) and the procedure to securely distribute these artefacts to different machines. All the related information as well as the workshop tutorial can be found here: https://github.com/jr-robotics/ROS-Industrial_EU_Fall19_Workshop
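As a rough illustration of the SROS2 workflow covered in the session, the sketch below drives the `ros2 security` command line from Python to create a keystore and per-node keys, then launches a node with security enabled. The keystore path and node names are assumptions, and the subcommand and environment variable names follow the SROS2 tutorials of that era (they have changed in later ROS2 releases); the workshop repository above is the authoritative walkthrough.

```python
#!/usr/bin/env python3
# Sketch: generate SROS2 artifacts via the `ros2 security` CLI and run a node
# with security enabled. Paths and node names are assumptions; command and
# environment variable names follow the Dashing-era SROS2 tutorials.
import os
import subprocess

KEYSTORE = os.path.expanduser("~/sros2_demo_keys")  # assumed keystore location

# Create the keystore (CA, governance file) and per-node keys/certificates.
subprocess.run(["ros2", "security", "create_keystore", KEYSTORE], check=True)
for node in ["/talker", "/listener"]:                # assumed node names
    subprocess.run(["ros2", "security", "create_key", KEYSTORE, node], check=True)

# Environment variables that make nodes load the artifacts at startup.
secure_env = dict(os.environ,
                  ROS_SECURITY_ROOT_DIRECTORY=KEYSTORE,
                  ROS_SECURITY_ENABLE="true",
                  ROS_SECURITY_STRATEGY="Enforce")
subprocess.run(["ros2", "run", "demo_nodes_cpp", "talker"], env=secure_env)
```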

The ScalABLE4.0 session at the ROS-Industrial EU Fall'19 Workshop focused on presenting the set of technologies enabling flexibility in production lines in two industrial pilots from the automotive sector: PSA Peugeot Citroën and Simoldes Plásticos. Within the project, a complete digital manufacturing software stack is being developed, entitled the 'Open Scalable Production System' (OSPS). The OSPS aims to efficiently and effectively visualize, virtualize, construct, control, maintain and optimize production lines through a tight integration of enterprise information systems with transformable automation equipment, paired with the open interfaces necessary for optimized solutions on all hierarchy levels (slides).

During the workshop, attendees were introduced to, and got a chance to test, interact with and develop against, the set of components that compose the OSPS, namely: (i) the Advanced Plant Model, which is responsible for virtually integrating data from the industrial shop floor into a centralized digital twin; (ii) the Production Manager, a cloud-based software module that issues and supervises the execution of manufacturing tasks; (iii) SkiROS and the Task Manager, two distinct ROS-based approaches to orchestrating the behaviour of robotic systems; (iv) the Skill-based Robot Programming methodology, which enables the reuse and adaptation of ROS-based robotic applications across different purposes, platforms and environments; and (v) finally, the ROS-CODESYS bridge (ROBIN - https://github.com/ScalABLE40/robin), which enables horizontal integration between robots and automation equipment.

As part of the ScalABLE4.0 project, Bjarne Grossmann from AAU, co-founder of RiACT, presented the skill-based robot control software SkiROS v2 (slides). The technology is based on extended behavior trees that allow the definition of reactive behavior for highly flexible manufacturing environments. The framework is backed by a semantic database for inference and task planning support, allowing complex tasks to be generated automatically. In the hands-on session, Bjarne demonstrated the system with a SkiROS implementation of the classical ROS turtlesim demo, and showed that SkiROS can easily be used to create complex behavior (and not only for turtles). The demo can be found in the git repository https://github.com/Bjarne-AAU/skiros-demo. An official open-source release of the software is coming soon. Stay tuned on www.riact.eu!
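To make the behavior-tree idea concrete, here is a deliberately generic sketch of how a skill composed of tree nodes "ticks". It is not the SkiROS API (see the demo repository above for that), just an illustration of why reactive behavior falls out of repeatedly re-evaluating such a tree.

```python
# Generic behavior-tree sketch (NOT the SkiROS API): each node reports
# success/failure/running when ticked, and composites combine child results.
SUCCESS, FAILURE, RUNNING = "success", "failure", "running"

class Sequence:
    """Tick children in order; stop at the first one that fails or is running."""
    def __init__(self, *children):
        self.children = children
    def tick(self):
        for child in self.children:
            status = child.tick()
            if status != SUCCESS:
                return status
        return SUCCESS

class Condition:
    def __init__(self, predicate):
        self.predicate = predicate
    def tick(self):
        return SUCCESS if self.predicate() else FAILURE

class Action:
    def __init__(self, fn):
        self.fn = fn
    def tick(self):
        return self.fn()

# A toy "move to goal" skill: only act if the precondition holds.
pose_known = Condition(lambda: True)
move = Action(lambda: SUCCESS)
skill = Sequence(pose_known, move)
print(skill.tick())  # -> success
```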

The next European expert workshops will be organized in Spring and Fall 2020. We will keep you posted!

Documentation updates improve ROS utilization and functionality

Lessons from UR driver updates reinforce importance of documentation

A key strength of the open-source community is the capacity to build on the knowledge of other developers who enable future advancements. Documentation plays a critical role in advancing understanding, which improves ROS utilization globally. This will be increasingly important as we move from ROS to ROS2 and document the various steps, including driver updates, necessary to execute projects.

Recent driver updates for a Universal Robots project helped demonstrate the importance of documentation to our team. In July 2019, Universal Robots updated their software for the e-Series and CB-Series to 5.4 and 3.10 respectively. We were using the 5.4 software on UR10e robots for a few different projects, and the ur_modern_driver [1] for ROS Kinetic and Melodic was no longer compatible after this update. I updated the driver by investigating the release notes for 5.4.x.x [2] and the client_interface document [3].

UR10 E-series in the SwRI collaborative lab

To update the ROS driver for compatibility with software updates, I first identified what changes occurred and located appropriate software documentation for the hardware. The software documentation defined modules and variable types for the changes, which allowed for comparison with equivalent variables and modules in the current driver code. Without proper documentation from Universal Robots this would have been a much more difficult endeavor.

The ur_modern_driver specifically interacts with the client interface. Two variables were added to the 5.4 software client interface: a reserved byte for internal UR use in the Masterboard data sub-package of the Robot State Message, and a safety status value in the real-time interface. The client interface document gave the types and sizes of these variables, along with those of the variables used in previous software versions.

The client interface for 5.3 and earlier had an internal UR integer in Robot Mode Data that I could compare against, and the safety status value could be compared to any of the other double variables in the previous driver's RealTime interface. I used the search feature in Qt Creator to find instances of these variables and added equivalent lines for the new ones. I then used Qt Creator's debugger to track where new if statements and functions needed to be added so the driver could detect the new software version being used and access these new variables.
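To give a flavour of what such a version-aware change looks like, the sketch below reads one packet from the UR real-time interface and only extracts the new safety status double when the controller software is 5.4 or newer. The field offsets, IP address, and the position of the new value are illustrative assumptions; the client_interface document [3] and the driver source remain the authoritative references.

```python
# Sketch of a version-aware read of the UR real-time interface (port 30003).
# Offsets and the position of the new safety status value are illustrative.
import socket
import struct

REALTIME_PORT = 30003

def read_packet(sock):
    """Read one length-prefixed real-time packet from the controller."""
    header = sock.recv(4, socket.MSG_WAITALL)
    (length,) = struct.unpack(">i", header)          # total size, big-endian
    return sock.recv(length - 4, socket.MSG_WAITALL)

def parse(body, major, minor):
    # The real-time payload is a sequence of big-endian doubles.
    count = len(body) // 8
    doubles = struct.unpack(">%dd" % count, body[:count * 8])
    state = {"q_actual": doubles[31:37]}             # illustrative offset
    if (major, minor) >= (5, 4):
        # New in 5.4: a safety status value appended to the packet.
        state["safety_status"] = doubles[-1]         # illustrative position
    return state

if __name__ == "__main__":
    s = socket.create_connection(("192.168.1.10", REALTIME_PORT), timeout=2.0)
    print(parse(read_packet(s), major=5, minor=4))
```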

Working on this upgrade reinforced the necessity of good documentation. In general, the ur_modern_driver had more detailed documentation than many other ROS repos; however, it could still be improved. The README made no mention of the purpose of the use_lowbandwidth_trajectory_follower parameter in the launch files, or of the urXXe_bringup_joint_limited.launch file; both are useful when simulations are being overenthusiastic in their trajectory planning. I added documentation to the README to help others use these features to troubleshoot.

To update drivers, you will need to know what has been changed in the software, and you will ideally have access to a previous version of the driver. Because industrial hardware is intended to be reliable and accessible for multiple clients, there is often plenty of useful documentation if you can search with the correct terminology. Using the known changes and the documentation to compare with the previous driver code allowed me to update the driver fairly quickly so projects could move forward.

[1] https://github.com/ros-industrial/ur_modern_driver

[2] https://www.universal-robots.com/how-tos-and-faqs/faq/ur-faq/release-note-software-version-54xx/

[3] https://www.universal-robots.com/how-tos-and-faqs/how-to/ur-how-tos/remote-control-via-tcpip-16496/

YAK: 3D Reconstruction in ROS2

So why YAK?

It’s difficult for robots to perceive objects in the real world, especially when those objects are shiny, previously unseen, or (gasp!) both! We are excited to introduce Yak, an open-source GPU-accelerated ROS2 package which addresses some of these challenges using Truncated Signed Distance Fields. The Southwest Research Institute booth demo from the Automate 2019 conference serves as a case study for a ROS2 system that integrates Yak into its perception and motion planning pipeline.

A bit of technical background

A Truncated Signed Distance Field (TSDF) is a 3D voxel array representing objects within a volume of space, in which each voxel is labeled with the signed distance to the nearest observed surface, truncated beyond a narrow band around that surface. The TSDF algorithm can be efficiently parallelized on a general-purpose graphics processor, which allows data from RGB-D cameras to be integrated into the volume in real time.

Numerous observations of an object from different perspectives average out noise and errors due to specular highlights and interreflections, producing a smooth continuous surface. This is a key advantage over equivalent point-cloud-centric strategies, which require additional processing to distinguish between engineered features and erroneous artifacts in the scan data. The volume can be converted to a triangular mesh using the Marching Cubes algorithm and then handed off to application-specific processes.
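For readers who want to see the update rule itself, below is a minimal NumPy sketch of integrating a single depth image into a TSDF volume. Yak does this on the GPU; this version only illustrates the weighted running average that makes noise from specular highlights wash out over many observations. The intrinsics, camera pose, and voxel layout are assumed inputs.

```python
# NumPy sketch of one TSDF integration step (illustrative, CPU-only).
import numpy as np

def integrate(tsdf, weight, voxel_centers, depth, K, T_cam_world, trunc=0.02):
    """tsdf, weight: (N,) arrays; voxel_centers: (N,3) world coordinates;
    depth: (H,W) metric depth image; K = (fx, fy, cx, cy); T_cam_world: 4x4."""
    fx, fy, cx, cy = K
    h, w = depth.shape

    # Project voxel centers into the camera.
    pts = (T_cam_world[:3, :3] @ voxel_centers.T + T_cam_world[:3, 3:4]).T
    z = pts[:, 2]
    zc = np.maximum(z, 1e-9)                       # avoid divide-by-zero
    u = np.round(fx * pts[:, 0] / zc + cx).astype(int)
    v = np.round(fy * pts[:, 1] / zc + cy).astype(int)
    valid = (z > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)

    # Signed distance along the ray, truncated to [-trunc, trunc].
    sdf = depth[v[valid], u[valid]] - z[valid]
    keep = sdf > -trunc                            # ignore voxels far behind the surface
    d = np.clip(sdf[keep], -trunc, trunc) / trunc

    # Weighted running average: repeated observations average out noise.
    idx = np.flatnonzero(valid)[keep]
    w_old = weight[idx]
    tsdf[idx] = (tsdf[idx] * w_old + d) / (w_old + 1.0)
    weight[idx] = w_old + 1.0
    return tsdf, weight
```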

Machined Al Test Part

YAK Reconstruction of Machined Al Part

Reconstructing in the real world

My group at Southwest Research Institute has been working with TSDFs since Spring 2017. This year we started work on our first commercial projects leveraging TSDFs in industrial applications. We developed our booth demo at the 2019 Automate conference to provide an open-source example of this type of system.

The set of libraries we use in these applications is called Yak (which stands for Yet Another KinectFusion, in reference to the substantial prior history of TSDF algorithms). Yak consists of two repositories: a ROS-agnostic set of core libraries implementing the TSDF algorithm, and a repository containing ROS packages wrapping the core libraries in a node with subscribers for image data and services to handle meshing and resetting the volume. Both ROS and ROS2 versions of this node are provided. A unique feature of Yak compared to previous TSDF libraries is that the pose of the sensor origin can be provided through the ROS tf system from an outside source such as robot forward kinematics or external tracking, which is advantageous for robotic applications since it leverages information that is generally already known to the system.
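The pattern of taking the sensor pose from tf rather than estimating it from the images can be sketched in a few lines of rclpy. The topic and frame names below are hypothetical and the TSDF fusion call is omitted; the real node lives in the yak_ros repository.

```python
# rclpy sketch: fuse depth images using a camera pose looked up from tf.
# Topic and frame names are hypothetical; see yak_ros for the actual node.
import rclpy
from rclpy.node import Node
from rclpy.time import Time
from sensor_msgs.msg import Image
import tf2_ros


class TsdfFrontend(Node):
    def __init__(self):
        super().__init__("tsdf_frontend")
        self.tf_buffer = tf2_ros.Buffer()
        self.tf_listener = tf2_ros.TransformListener(self.tf_buffer, self)
        self.sub = self.create_subscription(
            Image, "/camera/depth/image_rect_raw", self.on_depth, 10)

    def on_depth(self, msg):
        try:
            # Sensor pose from robot forward kinematics or external tracking.
            tf = self.tf_buffer.lookup_transform(
                "world", msg.header.frame_id, Time.from_msg(msg.header.stamp))
        except (tf2_ros.LookupException, tf2_ros.ExtrapolationException):
            self.get_logger().warn("No camera pose available yet, skipping image")
            return
        # Hand the image and pose to the TSDF integration step (not shown).
        t = tf.transform.translation
        self.get_logger().info(
            "Would fuse image with camera at (%.3f, %.3f, %.3f)" % (t.x, t.y, t.z))


def main():
    rclpy.init()
    rclpy.spin(TsdfFrontend())


if __name__ == "__main__":
    main()
```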

The test system, which is in the pipeline for open-source release as a practical demonstration, uses a Realsense D435 RGB-D camera mounted on a Kuka iiwa7 to collect 3D images of a shiny metal tube bent into a previously-unseen configuration. The scans are integrated into a TSDF volume as the robot moves the camera around the tube.

The resulting mesh is processed by a ROS node using surface analysis functions from VTK to extract waypoints and tangential vectors along the length of the tube. These waypoints constitute the seed trajectory for a motion plan generated by our Descartes and Tesseract libraries which sweeps a ring along the tube while avoiding collision. Camera and turntable extrinsic calibration was performed using a nonlinear optimization function from the robot_cal_tools package.

A picture of the reconstructed mesh used to generate a robot trajectory is below. Additional information about the development of this demo (plus some neat video!) is available here. As ROS2 training is integrated into the ROS-I Consortium Americas curriculum, we look forward to sharing more lessons learned about applying Yak towards similar applications.

Reconstructed tube mesh used to generate the robot trajectory

What next?

Yak is open-source as of July 2019. While development is ongoing and we anticipate that the library APIs will continue to evolve, we encourage interested parties to check it out at github.com/ros-industrial/yak and github.com/ros-industrial/yak_ros .

Field notes from Automate 2019, and why we’re bullish on ROS2

What makes a good industrial automation demonstration? When we started preparing for Automate 2019 back in January, a few key points came to mind. Our specialty in SwRI’s Manufacturing and Robotics Technology Department is advanced robotic perception and planning, so we decided that the robot should perform an authentic dynamic scan-and-plan process on a previously-unseen scene – as far away as we could get from a “canned” demo. We also wanted the demo to be an interactive experience to help drive discussion with visitors and entertain onlookers. These goals led us to the tube threading concept: a human would bend a piece of shiny metal tubing into a novel shape, and the robot would perceive it and plan a path to sweep a ring along it.

Michael Ripperger & Joseph Schornak on location at Automate 2019

Developing a demo system presents an opportunity to explore new ideas in a low-risk environment because the schedule and deliverables are primarily internally-motivated. Since my group had limited previous exposure to ROS2, we decided that our Automate demo should use ROS2 to the greatest possible extent. The original vision was that the system would be entirely composed of ROS2 nodes. However, due to the practical requirements of getting everything working before the ship date, we decided to use a joint ROS/ROS2 environment, with the ROS motion planning and GUI nodes communicating with the ROS2 perception nodes across the ROS-to-ROS2 bridge.

ROS2 Strengths and Challenges

In contrast to virtually every other robotics project I’ve worked on, the demo system’s perception pipeline worked consistently and reliably. Intel maintains a ROS2 driver for Realsense RGB-D cameras, which allowed us to use the D435 camera without any customization or extra development. Our YAK surface reconstruction library, based on the Truncated Signed Distance Field algorithm, helped us avoid the interreflection issues that would usually plague perception of shiny surfaces. After a couple of afternoons spent learning how to use the (new to me) VTK libraries, the mesh-to-waypoint postprocessor could consistently convert tube scans into trajectory waypoints. More information about this software is available in the SwRI press release or the writeup in Manufacturing Automation.

Block Diagram of SwRI ROS-I Automate 2019 Demonstration

Motion planning turned out to be a particularly challenging problem. Compared to a traditional robot motion task like pick-and-place, which involves planning unconstrained paths through open space, the kinematic constraints of the tube threading problem are rather bizarre. While the ring tool is axially underconstrained and can be rotated freely to the most convenient orientation, it is critical that it remain aligned with the axis of the tube to avoid collision. It’s impossible to flip the ring once it’s over the tube, so if the chosen ring orientation causes the robot to encounter a joint limit halfway down the tube, tough luck! Additionally, the robot must avoid collision between the tube and robot hardware during motion. Our initial solution used Trajopt by itself, but it would sometimes introduce unallowable joint flips since it tried to optimize every path waypoint at once without a globally-optimal perspective on how best to transition between those waypoints. We added the Descartes sampling algorithm, which addressed these issues by populating Trajopt’s seed trajectory with an approximate globally-optimal path that satisfied these kinematic and collision constraints. Planning still failed occasionally: even with a kinematically-redundant Kuka iiwa7 arm, solving paths for certain tube configurations simply wasn’t feasible[^1].
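The sampling idea can be illustrated with a short sketch: because the ring is free to spin about the tube axis, every waypoint expands into a rung of candidate tool poses, and a graph search over those rungs yields a flip-free seed path. The helper below is illustrative only; in the demo this role was played by Descartes, with Trajopt refining the result.

```python
# Illustrative sketch: enumerate candidate ring poses at one tube waypoint by
# sweeping the free rotation about the tube axis (a Descartes-style "rung").
import numpy as np

def candidate_poses(p, tube_axis, n_samples=36):
    """Return n_samples 4x4 tool poses at point p, tool z aligned with the tube axis."""
    z = tube_axis / np.linalg.norm(tube_axis)
    # Any vector not parallel to z seeds an orthonormal frame.
    seed = np.array([1.0, 0.0, 0.0]) if abs(z[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    x0 = np.cross(seed, z)
    x0 /= np.linalg.norm(x0)
    y0 = np.cross(z, x0)
    poses = []
    for theta in np.linspace(0.0, 2 * np.pi, n_samples, endpoint=False):
        x = np.cos(theta) * x0 + np.sin(theta) * y0   # rotate about the tube axis
        y = np.cross(z, x)
        T = np.eye(4)
        T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, p
        poses.append(T)
    return poses

# Each waypoint along the tube contributes one rung of candidates to a ladder
# graph; the cheapest rung-to-rung path seeds the Trajopt optimization.
print(len(candidate_poses(np.array([0.5, 0.0, 0.4]), np.array([0.0, 1.0, 0.0]))))
```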

TrajOpt Path Planning Implementation & Testing

[^1]: The extent of solvable tube configurations could be greatly increased by including the turntable as a controllable motion axis. Given the constraints of the iiwa7’s ROS driver, we decided that this would be, in technical software terms, a whole other can of worms.

We shipped the robot hardware about a week in advance of the exhibit setup deadline. Our reliance on ROS meant we could switch to simulation with minimal hassle, but there were some lingering issues with the controller-side software that had to wait until we were reunited with the robot the Saturday before the show[^2]. This contributed to moderate anxiety on Sunday evening as we worked to debug the system using real-world data. We had to cut some fun peripherals due to time constraints, such as the handheld ring wand that would let visitors race the robot. By Tuesday morning the robot was running consistently, provided we didn’t ask it to solve paths for too-complicated tubes. This freed up some time for me to walk the halls away from our booth and talk to other exhibitors and visitors.

[^2]: Our lunch upon arrival was Chicago-style deep dish pizza, which conveniently doubled as dinner that evening.

More Collaborative Robots

There were collaborative robots of all shapes and sizes on display from many manufacturers. I may have seen nearly the same number of collaborative robots as traditional ones! A handful were programmed to interact with visitors, offering lanyards and other branded largesse to passersby. Most of them were doing “normal robot things,” albeit intermingled with crowds of visitors without any cages or barriers, and generally at a much more sedate pace compared to the traditional robots. Some of the non-collaborative robots were demonstrating safety sensors that let them slow down and stop as visitors approached them -- I usually discovered these by triggering them accidentally.

I was surprised by the number of autonomous forklifts and pallet transporters. I’m told that there were more in 2019 than at previous shows, so I’m curious about what recent developments drove growth in this space.

I learned that ROS-Industrial has significant brand recognition. I got pulled into several conversations solely because I was wearing a ROS-I polo! Many of these discussions turned to ROS2, which produced some interesting insights. Your average roboticist-on-the-street is aware of ROS2 (no doubt having read about it on this very blog), but their understanding of its capabilities and current condition might be rather fuzzy. Many weren’t sure how to describe the key differences between ROS and ROS2, and a few weren’t even aware that ROS2 has been out in the wild for three versions! I’ll unscientifically hypothesize that a key challenge blocking wider ROS2 adoption is the lack of demonstrated success on high-visibility projects. Our demo drove some good conversation to alleviate these concerns: I could show a publicly-visible robotic system heavily reliant on ROS2 and point to the open-source native ROS2 device drivers that let it function.

Showcasing Perception and Planning Potential

In terms of demo reception, people who visited our booth were impressed that we were scanning and running trajectories on previously-unseen parts. I usually had to provide additional context to show how our perception and planning pipeline could be extended to other kinds of industrial applications. There’s a tricky balance at play here – an overly abstract demo requires some imagination on the part of the viewer to connect it to an industrial use case, but a highly application-specific demo isn’t easily generalized beyond the task at hand. Since our group specializes in application-generic robot perception and planning, I think that a demo tending towards the abstract better showcases our areas of proficiency. This is a drastically different focus from other exhibits at the show, which generally advertised a specific automation process or turnkey product. I feel like we successfully reached our target audience of people with difficult automation tasks not addressed by off-the-shelf solutions.

Development of the Industrial YAK reconstruction for the Automate Demo in ROS2

While it certainly would have been easier to adapt an already-polished system to serve as a show demo, developing a completely new one from scratch was way more fun. Improvements made to our perception and planning software were pushed back upstream and rolled into other ongoing projects. We’re now much more comfortable with ROS2, to the extent that we’ve decided that from here on out new robotics projects will be developed using ROS2. The show was a lot of fun, a great time was had by all, and I hope to see you at Automate 2021!

ROS-Industrial Consortium Europe is heading towards ROS2

With the growing excitement and curiosity surrounding ROS2, ROS-Industrial Consortium Europe (RIC-EU) had the pleasure of hosting the Spring 2019 edition of the RIC-EU Tech Workshop. It took place on May 6th and 7th at Fraunhofer IPA in Stuttgart, Germany. Some of the main drivers of DDS and ROS2 developments personally presented their insights and gave hands-on sessions during the event. For this, participants were provided with USB sticks with Ubuntu Bionic, ROS Melodic and ROS2 Crystal pre-installed (just as for all our ROS-Industrial trainings). The event was free for worldwide members of any ROS-Industrial Consortium and was fully booked, with 40 people attending from all over Europe.

On Day 1, the workshop started with RIC-EU manager Thilo Zimmermann who welcomed the participants at Fraunhofer IPA and introduced the ROS-Industrial Consortium Europe and its EU project funding opportunity (next cut-off dates June 14 and September 13, 2019).

As ROS2 supports multiple DDS/RTPS implementations, RIC-EU proudly hosted one of the most popular DDS vendors, eProsima, to explain the main concepts of DDS and present their stack at the workshop. During the five hours of presentations and hands-on workshops, Borja Outerelo Gamarra and Jaime Martin Losa covered topics such as an introduction to DDS, a presentation of the standard and the motivation behind DDS, DDS architecture, and DDS QoS. Attendees practised on a “hello world” example. eProsima's slides can be found here.
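In the same spirit as the hands-on, a minimal ROS2 publisher with an explicit DDS QoS profile looks roughly like the sketch below. It is written against a recent rclpy, so the enum and constructor names may differ slightly from the ROS2 Crystal API used at the workshop; the topic name is arbitrary.

```python
# Minimal "hello world" publisher with an explicit DDS QoS profile (rclpy).
import rclpy
from rclpy.node import Node
from rclpy.qos import QoSProfile, QoSReliabilityPolicy, QoSDurabilityPolicy
from std_msgs.msg import String


class HelloPublisher(Node):
    def __init__(self):
        super().__init__("hello_dds")
        qos = QoSProfile(
            depth=10,                                          # history buffer size
            reliability=QoSReliabilityPolicy.RELIABLE,         # vs. BEST_EFFORT
            durability=QoSDurabilityPolicy.TRANSIENT_LOCAL)    # late joiners get the last message
        self.pub = self.create_publisher(String, "chatter", qos)
        self.timer = self.create_timer(1.0, self.tick)

    def tick(self):
        self.pub.publish(String(data="hello world"))


def main():
    rclpy.init()
    rclpy.spin(HelloPublisher())


if __name__ == "__main__":
    main()
```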

On Day 2, Ralph Lange from RIC-EU member BOSCH gave an in-depth presentation on the current status of ROS2. He included hands-on tasks using ROS2, showed new features, and provided information on the upcoming D-turtle “Dashing Diademata” release on May 31, 2019. Ralph's presentation slides "Current Status of ROS2 - Hands-on Feature Overview" can be found here.

The second presentation, by Ingo Lütkebohle, also from BOSCH Corporate Research, introduced the micro-ROS activity. Ingo is one of the investigators of the EU-funded OFERA project, which ports ROS2 to “extremely resource constrained devices” (usually microcontrollers) using the new DDS-XRCE standard. He demonstrated this using a Cortex-M4 board mounted on a first-generation TurtleBot. Ingo's presentation slides can be found here.

After a lunch break, Ludovic Delval of Fraunhofer IPA gave a hands-on workshop on how to migrate a ROS1 node to ROS2. Lastly, Harsh Deshpande, also from Fraunhofer IPA, previewed the porting of the ur_modern_driver to ROS2 and presented a proposal for the action_bridge, which currently bridges between a ROS1 action client and a ROS2 action server.
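As a taste of what the migration hands-on involves, the sketch below shows a classic ROS1 talker (in the comment) next to its rclpy equivalent, where the node object owns publishers and timers and there is no global rospy state. It is a minimal illustration, not the workshop material.

```python
# The ROS1 node
#
#     import rospy
#     from std_msgs.msg import String
#     rospy.init_node("talker")
#     pub = rospy.Publisher("chatter", String, queue_size=10)
#     rate = rospy.Rate(1)
#     while not rospy.is_shutdown():
#         pub.publish(String(data="hello"))
#         rate.sleep()
#
# becomes, in ROS2 / rclpy:
import rclpy
from rclpy.node import Node
from std_msgs.msg import String


class Talker(Node):
    def __init__(self):
        super().__init__("talker")
        self.pub = self.create_publisher(String, "chatter", 10)
        self.create_timer(1.0, lambda: self.pub.publish(String(data="hello")))


def main():
    rclpy.init()
    rclpy.spin(Talker())


if __name__ == "__main__":
    main()
```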

At the end of the workshop, participants and ROS-Industrial Consortium members agreed that 2019 promises a lot of developments in ROS2. In April, at the ROS-I Consortium Americas 2019 Annual Meeting, RIC members took part in an interesting panel session titled “Is ROS2 Ready for the Factory Floor?”. In June, Ludovic Delval of Fraunhofer IPA will present the latest updates at ROSCon France in Paris, and Harsh Deshpande will do so at the ROS-Industrial AP Workshop 2019 in Singapore.

The next RIC-EU Tech Workshop is foreseen for Fall 2019 (tentative dates October 09-10). The 2019 edition of the ROS-Industrial Conference is planned for December 10-12, 2019 (save the date!).

ROS-Industrial Americas 2018 Annual Meeting Review

The ROS-Industrial Consortium Americas (RICA) held its 2018 Annual Meeting in San Antonio, on the campus of Southwest Research Institute (SwRI), on March 7th and 8th, 2018. This was a two-day event, with the 7th open to the public and including tours and demonstrations, followed by a Consortium Members meeting on the 8th with a road-mapping exercise and project idea brainstorming.

This was the first time that RICA held the event over two full days. It was also the most well-attended event to date, topping out at over 80 people on the 7th. The talks spanned from the more strategic/visionary to the technical with regard to open-source robotics application development. This provided an excellent cross-section of the technical development community and organizational decision makers, who could share ideas, cross-pollinate, and take what they learned back to their organizations.

The morning of the 7th featured:

  • SwRI Introduction - Paul Evans - SwRI
  • ROS-I Consortium/Introduction - Matt Robinson - SwRI
  • Manufacturing in Mixed Reality - Dr. Aditya Das - UTARI
  • Discussion on the Design of a Multiuse Workcell and Incorporation of the Descartes Package - Christina Petlowany - UT Austin Nuclear Robotics Group
  • Integrating ROS into NASA Space Exploration Missions - Dustin Gooding - NASA

The talks touched on a mix of how humans can interact with the technological solutions and also the need for solutions that can work within environments originally designed for people. The common thread is enabling humans and robots to work more efficiently within the same spaces, and leveraging the same tools.

Rick Meyers of the ARM Institute & Air Force Research Laboratory, during the lunchtime keynote, discussed the vision and motivations of Air Force ManTech to drive advancements in automation and robotics in the manufacturing environment. This tied into the motivation of the Advanced Automation for Agile Aerospace Applications (A5) program, and how ROS ties into the realization of the Air Force ManTech vision.

The tours and demonstrations included many different applications, all with either a ROS or ROS-Industrial element, though in some cases complementary. ADLINK Neuron focused on coordinated mobile robots and on a means to assist their industrial partners in easily transitioning to the ROS2 environment, providing consulting services for DDS implementation and ROS-related algorithm development.

KEBA demonstrated their new ROS RMI interface integrated into their controller, while UTARI demonstrated Manufacturing in Mixed Reality implemented through the Microsoft HoloLens, allowing users to fuse process guidelines, real-time inspection data, and cross reference information to determine adaptive measures and project outcomes.

SwRI and the ROS-I team demonstrated an example of merging SwRI’s Human Performance Initiative’s markerless motion capture with path planning to retrieve an object from an open grasp. SwRI’s Applied Sensing Department showcased their Class 8 truck, enabling all attendees to go for a ride while gaining insights into the vehicle’s capabilities. The ROS-I team at SwRI also presented Robotic Blending Milestone 4, Intelligent Part Reconstruction with TSDF implementation, and Trajopt, a sequential convex optimizer newly fully integrated into ROS. The UT Austin Nuclear Robotics Group demonstrated improved situational awareness for mobile manipulation on their Husky platform, where users could “drive” the system to pick up a presented object.

Finally, the SwRI team presented and demonstrated the A5 platform, which is a mobile manipulation platform designed to perform numerous processes on large aircraft in an unstructured setting. The process demonstrated was sanding of a test panel overhead. Overviews of the localization and planning on the visualization were included.

Talks for the afternoon centered around OEM and Integration service providers, and included:

  • ADLINK Neuron: An industrial oriented ROS2-based platform - Hao-Chih Lin - ADLINK
  • Unique ROS Combination with Safety and PLC - Thomas Linde - KEBA
  • Leveraging ROS-Industrial to Deliver Customer Value - Joe Zoghzoghy - Bastian Solutions

This set of talks brought home innovations by the OEM and service provider communities. Bastian Solutions’ story, from concept via working with the ROS-Industrial team, through pilot and into production, demonstrated a real value proposition for mobile solutions, and for broader ROS-enabled development, in the integrator community.

The morning of the 8th featured:

  • RIC-Americas Highlights and Upcoming Events - Matt Robinson & Levi Armstrong - SwRI
  • RIC-Europe Highlights & ROSiN Update - Mirko Bordignon - Fraunhofer IPA
  • ROS-Industrial Lessons from Bootstrapping in Asia Pacific - Min Ling Chan - ARTC
  • ROS2 is Here - Dirk Thomas - Open Robotics
  • ARM Institute Introduction & Update - Bob Grabowski - ARM Institute
  • Windows IoT & Robotics - Lou Amadio - Microsoft

Matt Robinson covered strategic initiatives for the Consortium followed by Levi Armstrong covering RICA technical developments, including TrajOpt and Intelligent Part Reconstruction, Noether, PCL Afront Mesher, and Qt Creator updates and upcoming release.

Mirko Bordignon highlighted for the Americas audience what is happening around the ROSIN initiative, driving awareness and furthering the global nature of ROS-I. Min Ling Chan shared progress within the Asia-Pacific region and the progress and status of the PackML Focused Technical Project, which has a Phase 2 launch coming soon.

Dirk Thomas of Open Robotics presented the latest on ROS2, and for the first time we were happy to welcome Bob Grabowski of the ARM Institute. The ARM Institute is the newest DoD Manufacturing Innovation Institute, and this is the first Annual Meeting since the Institute’s launch. Synergies between the ARM Institute and ROS-I will be important to monitor moving forward.

The morning session concluded with the Windows IoT and Azure teams, represented by Lou Amadio and Ryan Pedersen respectively, presenting their current strategy for ROS support and their plans moving forward, particularly for ROS2.

The featured keynote was presented by Dr. Phil Freeman of Boeing, “Why Boeing is Using ROS-Industrial.” Phil offered great insights to the value of ROS-Industrial for Boeing, and what it has enabled for their operations in the context of the challenges Boeing faces. The talk featured example applications and conveyed the message that within the robotics space we truly are at a tipping point with regards to capability and accessibility.

A road-mapping session was then conducted, focusing on problems to solve. The idea is to tie problems to projects and then identify the capabilities that need to be developed to meet certain prioritized problems. The problem focus areas were Human Capability, Quality Processes and Execution, Flexibility/Agility, and Strategy/Alignment. Common themes were: standard interfaces, documentation, ROS2 for Industrial applications, ownership and community engagement, simpler recovery means, and real-time diagnostics.

The afternoon speaker session touched on technologies that seek to enable richer and more reliable networking and data sharing/management throughout the application development and implementation process, and across the value stream.

Now that the dust has settled, these are some observations from this seat:

  1. ROS-Industrial is a big tent, and is truly global. Each Consortium needs to optimize how it works within their region to meet their member needs and optimally leverage resources available to them.
  2. As regional resources are optimized, the other consortia need to monitor developments, share information, and ensure that all within the broader ROS-I organization are aware of what is in flight and which development activities are happening where, to reduce or eliminate redundant efforts.
  3. ROS2 is here, but there is work to do. It will be important to monitor developments and foster awareness to enable developers, solution providers, and end users to leverage ROS2 capability to complement their end solutions when and where appropriate.
  4. There are a number of innovators, solution providers, and end users realizing value proposition on ROS/ROS-Industrial deployments TODAY, and in some cases for some time. Let’s socialize and share their success stories.
  5. Foster both membership engagement and community engagement in the vision and execution of the vision for ROS-Industrial. We are excited to both enable start-ups to engage, but also improve how we leverage our University partners. Through effective projects, sponsorships, or roles within the ROS-I organizational structure, these all help foster a sense of community and subsequent ownership.
  6. There is an inflection or tipping point, and for advanced robotics this seems to be the appropriate time. The idea that ROS can span beyond just the robotic processes, enabling more intelligent processing by leveraging IoT and other advanced technologies for further end-user value, also seems to be gaining steam.
  7. We advance ROS-Industrial together. Engage, participate, communicate, and we succeed together.

As always, we are looking forward to feedback on the event and on how to improve this and future events. We are looking forward to bringing back the online quarterly membership meetings, so keep an eye out for those; coordination and invites are handled on a rotational basis by the three Consortium managers. ROS-Industrial is an open-source project, and with that we seek to be open, to be that forum for sharing ideas, and to solve problems for industry in the 21st century.

Public day presentations can be found on the Event Page within the agenda after each speaker line item. Member day presentations are included behind the member portal, and are available for download.

Thanks for your support of open-source automation for industry!