Martijn Wisse, Professor of Biorobotics and coordinator of Factory-in-a-day, is the new Scientific Director of the TU Delft Robotics Institute. Prof. Wisse was officially appointed during the Institute's annual associates day on 24 January 2017.
Read more on the news section of their website: http://tudelftroboticsinstitute.nl/news/professor-martijn-wisse-leads-tu-delft-robotics-institute
This deliverable video focuses on dynamic obstacle avoidance, an essential component for ensuring safety in the robot's environment and for letting robots collaborate with fellow human beings, thus improving the efficiency of processes in the factory environment, which is one of the goals of the project.
In terms of technology, the Skin Sensors from TUM, the Reactive Path Planner from SIEMENS-PLM, and the Reactive Controller from LAAS make us confident in achieving this task. This deliverable combines all of these components into a running manipulation scenario that illustrates the dynamic obstacle avoidance capability; a conceptual sketch of how the components fit together follows the list below. The simulation video is a proof of concept for future deployment on the real robot. The illustration is done on the TOMM robot setup, with skin sensors on the right forearm of the robot.
The two excerpts in the deliverable video are as follows:
1. First, the reactive dynamic obstacle avoidance behavior using the simulated skin sensors with the ‘Stack of Tasks’ (SOT), the reactive controller driving the robot.
2. Second, the manipulation scenario, which shows the use of a PCL (Point Cloud Library) based planner together with the reactive SOT controller to avoid obstacles.
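To give an idea of how these components can be combined, the sketch below blends a planner-commanded velocity with a repulsive term computed from skin-sensor proximity readings, which is the basic idea behind reactive obstacle avoidance. This is a minimal conceptual sketch in plain Python; the data format, function names, and gains are illustrative assumptions, not the actual SOT or skin-sensor interfaces:

```python
import numpy as np

# Illustrative gains; a real reactive controller tunes these per task.
REPULSION_GAIN = 0.8    # scales the avoidance velocity
ACTIVATION_DIST = 0.15  # metres; readings closer than this trigger avoidance

def avoidance_velocity(skin_readings):
    """Sum repulsive Cartesian velocities over all active skin cells.

    skin_readings: list of (distance_m, outward_unit_normal) tuples,
    one per skin cell that currently senses an obstacle (assumed format).
    """
    v = np.zeros(3)
    for distance, normal in skin_readings:
        if distance < ACTIVATION_DIST:
            # Repulsion grows linearly as the obstacle approaches the skin.
            v += REPULSION_GAIN * (ACTIVATION_DIST - distance) * np.asarray(normal)
    return v

def blended_command(planned_velocity, skin_readings):
    """Combine the planner's velocity with the reactive avoidance term."""
    return np.asarray(planned_velocity) + avoidance_velocity(skin_readings)

# Example: the planner commands motion in +x while an obstacle approaches
# a skin cell whose outward normal points in +y.
cmd = blended_command([0.05, 0.0, 0.0], [(0.10, (0.0, 1.0, 0.0))])
print(cmd)  # -> [0.05 0.04 0.  ]
```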
Augmenting awareness by HoloLens: initial exploration
Author/Institution: Jouke Verlinden/TU Delft
This video is a first exploration of how Augmented Reality can be used in production planning and during deployment & operation. It is part of Deliverable 4.4.
- HoloLens: Although we considered projector-based and non-visceral augmentation, a stereoscopic head-mounted display provides better visual (and auditory) feedback and, as a portable solution, can be used throughout the workflow of the Factory-in-a-day process. In particular, see-through options allow a proper overlap with the environment, and the emergence of the Microsoft HoloLens and the Meta 2 glasses is making these technologies available in a proper, ergonomically fitting way. The scenario on motion paths and training gives an indication of how system state can be overlaid on the physical environment.
- How do you plan to use feature 1 of HoloLens in Factory-in-a-day (scene reconstruction)?
Scene reconstruction is essential for rendering occlusions, which increases the sense of Presence (the notion of ‘being there’). For example, at 2:10 in the video, only a part of the digital objects is rendered because a wall blocks the view. Another important aspect is that this enables a computational integration between spatial context, robot systems, and workers, and can be used to reason about the situational awareness of individual workers (in terms of visibility, system status, etc.).
- How do you plan to use feature 2 of HoloLens in Factory-in-a-day (object placement)?
Together with feature 1, this option shows how current gesture recognition is done with the HoloLens (so-called “Air tap” and gaze following). Furthermore, these features could be used during maintenance or configuration of robotic systems.
- What are the next steps? This is a first impression based on off-the-shelf software. Connectivity with ROS and assessing the usability of specific cues will be addressed during the coming 10 months.
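One plausible way to realize the ROS connectivity mentioned above is through rosbridge, which exposes ROS topics over WebSockets so that a HoloLens application can subscribe to them. The sketch below uses the roslibpy client library to publish a system-status message; the host address, topic name, and message contents are illustrative assumptions, not part of the deliverable:

```python
import roslibpy  # pip install roslibpy; requires a running rosbridge server

# Assumed rosbridge endpoint; an AR client would connect to the same server.
client = roslibpy.Ros(host='192.168.1.20', port=9090)
client.run()

# Hypothetical topic carrying system state for the AR overlay.
status = roslibpy.Topic(client, '/fiad/system_status', 'std_msgs/String')

# A subscribed HoloLens app could render this next to the physical robot.
status.publish(roslibpy.Message({'data': 'gripper: closed, task: pick'}))

client.terminate()
```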
Factory-in-a-day is mentioned in an article in Control Engineering magazine:
Nowadays, robotic assembly solutions are becoming easier to put together thanks to standardized hardware interfaces and specific “designed for” add-ons. However, at the software level, the configuration and operation of these extensions are becoming more complex, due to the greater number of configuration options and to extensions relying more and more on software for their functionality (e.g. vision systems).
The URCaps (UR Capabilities) are hardware and/or software extensions for the Universal Robot system. The purpose of the URCaps is to seamlessly extend any Universal Robot with customized functionality. Using the URCap software platform, third parties can define graphical user interfaces that seamlessly integrate with the UR workflow and provide device drivers for their hardware.
The research done in Factory-in-a-day project has contributed to the following features in the URCaps software platform:
- Workflow integration – Third party developers can provide custom installation extensions (see Figure 1) and custom program nodes (see Figure 2). The installation stores information and provides an interface for a specific hardware setup, i.e. settings that are valid for any program made with this hardware configuration. Custom program nodes can be used to hide complicated behavior and provide a convenient graphical user interface for the end-customer.
- Device drivers – Many hardware extensions require device drivers for the robot program to communicate with them, or an extension might require a daemon process running on the robot. The URCaps software platform provides a generic way to install and run a daemon process.
- Real-Time Data Exchange (RTDE) – Reliably exchange data between the UR robot controller and a third-party process to implement hierarchical control loops or monitoring software. Request specific robot state data (incl. registers) to be output at a specified rate. Input custom data (e.g. setpoints) through registers and use it in your program. Streaming is set up on a per-connection basis, and watchdogs are available to guard the input connection status. A minimal monitoring sketch follows this list.
- XML-RPC – XML-encoded Remote Procedure Calls for URScript. The RPCs look almost like calls to normal URScript functions, e.g. camera.get_next_object_coordinates("bolt", "M8"). However, RPCs are executed by a different software stack than the UR controller, which might run on a different computer and might have different software installed (e.g. image processing libraries). Any type used by URScript can be transmitted as an argument or received as a return value. Due to their simple usage and flexibility, RPCs are great for configuration and service calls, e.g. to set up a device, perform a computationally intensive calculation, or combine URScript with other software packages. A server-side sketch also follows this list.
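To make the RTDE item concrete, here is a minimal read-only monitoring sketch. It assumes the third-party ur_rtde Python bindings (one of several available RTDE client libraries, not part of the URCaps platform itself) and an illustrative robot IP address:

```python
from rtde_receive import RTDEReceiveInterface  # pip install ur_rtde

# Illustrative robot address; RTDE streams robot state at a configurable rate.
rtde_r = RTDEReceiveInterface("192.168.1.10")

q = rtde_r.getActualQ()          # current joint positions (rad)
tcp = rtde_r.getActualTCPPose()  # current TCP pose [x, y, z, rx, ry, rz]
print("joints:", q)
print("tcp pose:", tcp)
```

The camera call quoted in the XML-RPC item could be served by a small external process. The sketch below uses Python's standard xmlrpc module; the method name matches the example call, while the returned coordinates are placeholder values standing in for a real vision result:

```python
from xmlrpc.server import SimpleXMLRPCServer

def get_next_object_coordinates(object_type, size):
    # Placeholder: a real server would run its image processing pipeline
    # here and return the pose of the detected object.
    return [0.25, -0.10, 0.05]

server = SimpleXMLRPCServer(("0.0.0.0", 8000), allow_none=True)
server.register_function(get_next_object_coordinates)
server.serve_forever()
```

On the URScript side, such a server would typically be bound with something like camera = rpc_factory("xmlrpc", "http://&lt;server-ip&gt;:8000/") before calling camera.get_next_object_coordinates("bolt", "M8").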
The main benefits for third-party developers are:
- Easy integration & standardization – Seamless integration with the “normal” PolyScope workflow. No more “hacking” around to integrate software.
- Better support possibilities – The URCaps software platform provides, among other things, explicit interfaces and versioning. Configuration mistakes can be detected at programming time, and simple coding errors are eliminated.
- Sales / Marketing platform – Well-made URCaps can be featured on the Universal Robots+ website. Distributors and integrators can show each other accessories that run successfully at end users.
- New opportunities – New opportunities for integrating products and services, and tapping into new information sources.
The main benefits for end-customers / integrators are:
- Lower installation time – Easy-to-use graphical user interfaces lower the technical expertise required to install hardware. Furthermore, less time is needed for installation & setup, since the graphical interface can guide one through the process and provide real-time feedback.
- Faster programming – By lowering the threshold for configuring and programming robots, URCaps create a potential for highly reconfigurable solutions in the production environment. The robot becomes a general purpose tool rather than a highly specialized setup.
- Lower project risk – URCaps featured on the Universal Robots+ website have run successfully at end users and are dedicated to UR robots.
- Shorter lead time to implement robot applications – Access to well-proven technology will help to realize new automation solutions quicker.
The following two videos were made in cooperation with On Robot ApS and Robotiq. The videos show how lower installation time and faster programming for their end-customers have been achieved with the URCaps software platform. At the end of each video, the benefits for the end-customer and for URCaps developers are listed separately.
A new deliverable video is online for Deliverable 5.4: “A model-based task specification that includes programming by demonstration aspects”
Have a look at: http://www.factory-in-a-day.eu/media/videos/
- PAL Robotics in Barcelona
PAL ROBOTICS OPEN DAY – 25 NOV #ERW2016
Discover what’s behind our robots on Friday, November 25th! Let us know if you want to join one of the tours by sending an e-mail to firstname.lastname@example.org:
– Technical tour (12:00 – 14:00h)
– General tour (16:30 – 18:00h)
The team and robots are looking forward to meeting you!
- TU München – Chair for Cognitive Systems
On Thursday, November 24, 2016, the Chair for Cognitive Systems cordially invites everybody to come and visit us for an open lab afternoon. This is a unique opportunity to learn more about the robotics research at the chair. We will introduce some of our research topics, so just come by. We are happy to answer all questions!
This event is part of the European Robotics Week 2016, which offers one week of various robotics-related activities across Europe for the general public.
Where: Munich, Karlstr. 45, 2nd floor. Time: 17h-19h
More details and other events on http://www.eu-robotics.net/robotics_week/events/index.html
In Work Package 5 – Learnable Skills – our partner Universal Robots developed an assembly skill with their existing robot controller and GUI. The intuitive programming of this assembly skill was the challenge. The videos are part of Deliverable 5.4 “A model-based task specification that includes PbD aspects (Programming by demonstration)”.
The two videos are available in our video section.
At the end of October, Factory-in-a-day had its second project meeting of this year. This time we met in Barcelona, at our partner PAL Robotics.
The focus of this meeting was to set the plan for the final year of the project. Even though we have progressed towards the project’s goal of reducing the time needed to integrate robotic solutions into an assembly chain, there was still a lot to discuss: on which demonstrations will we focus, and which integration work has to be done?
There is also a short blog post about our meeting on the website of PAL Robotics.
The paper “Understanding the Intention of Human Activities through Semantic Perception: Observation, Understanding and Execution on a Humanoid Robot” by Karinne Ramirez-Amaro, Michael Beetz & Gordon Cheng is now available for download from the publisher: