European Robotics Forum 2017

Meet Factory-in-a-day at the European Robotics Forum 2017 in Scotland. We are part of the Workshop on Hybrid Production Systems on March 23!

Agenda:

  • 10:45 – 10:55 Introduction of Workshop and recap of HPS booth at AUTOMATICA 2016
  • 10:55 – 11:15 Early Results (moderated by Iñaki Maurtua)
  • 11:15 – 12:15 Interaction Technologies (moderated by Sotiris Makris)
  • 12:15 – 14:00 Lunch Break
  • 14:00 – 14:15 Applications (moderated by Ramez Awad)
  • 14:15 – 15:15 Safety Technologies (moderated by George Michalos)
  • 15:15 – 15:30 Discussion of Next Steps

Safety technologies:

Prof. Gordon Cheng (Technical University of Munich): The artificial skin in Factory-in-a-day

One important aspect of Factory-in-a-day is to provide safe robots (arms and mobile robots). This can be achieved with a novel proximity-sensing skin and dynamic contact-avoiding behaviours, allowing ubiquitous use of robots in shared workspaces with humans. The presentation will explain the advantages and potential of using the self-configuring skin developed by TUM.
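To make the idea of dynamic contact-avoiding behaviours concrete, the sketch below shows one possible rule in Python. It is our own illustration, not TUM's skin software, and the distance thresholds are made-up example values: the commanded velocity is scaled down and blended with a retracting motion as an obstacle approaches the closest skin cell.

```python
# Illustrative sketch only -- not the TUM skin API. It assumes each skin cell
# reports a proximity distance (in metres) and an outward surface normal.
import numpy as np

def avoidance_velocity(v_cmd, cell_distances, cell_normals,
                       d_stop=0.05, d_slow=0.30, v_repulse=0.10):
    """Scale the commanded Cartesian velocity and blend in a retracting term
    based on the closest proximity reading of the skin."""
    i = int(np.argmin(cell_distances))
    d = cell_distances[i]
    if d <= d_stop:                    # too close: retract along the cell normal
        return v_repulse * cell_normals[i]
    if d >= d_slow:                    # far away: keep the original command
        return v_cmd
    scale = (d - d_stop) / (d_slow - d_stop)      # linear slow-down in between
    return scale * v_cmd + (1.0 - scale) * v_repulse * cell_normals[i]

# Example: an obstacle 12 cm in front of a cell facing +x slows the motion down
# and adds a component that moves the arm back along +x.
print(avoidance_velocity(np.array([0.0, 0.2, 0.0]),
                         np.array([0.12, 0.45]),
                         np.array([[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])))
```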

Dr. Carlos Hernandez Corbato (TU Delft): Robot software in human-robot collaboration

Given its widespread adoption, ROS can be considered the de-facto standard software framework for service and mobile robotics. It is rapidly expanding into the industrial domain, also thanks to initiatives such as ROS-Industrial. Its features are an excellent match for the advanced capabilities required in applications involving human-robot collaboration (HRC). However, the lack of quality assurance for ROS components hinders their use in HRC applications, where the requirements for robustness and dependability are critical given the safety implications.

To address this situation, we developed the Automated Test Framework (ATF) as part of the Factory-in-a-day (FiaD) project toolbox. The ATF supports executing integration and system tests, running benchmarks, and monitoring code behaviour over time. It is one of a number of tools developed within public and private projects with the overall goal of providing better tooling and thus improving the quality of ROS-based robotics software and streamlining its development and deployment.
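To give a flavour of the kind of check the ATF automates, here is a minimal ROS integration test written with plain rostest/unittest rather than the ATF's own configuration files; the package name, test name and the two-second bound are made-up example values.

```python
#!/usr/bin/env python
# Minimal rostest-style integration test, illustrating the kind of check a
# framework like the ATF automates. '/joint_states' is a standard ROS topic;
# the package/test names and the 2 s bound are example values.
import unittest
import rospy
import rostest
from sensor_msgs.msg import JointState

class TestJointStatesPublished(unittest.TestCase):
    def test_joint_states_arrive(self):
        rospy.init_node('test_joint_states', anonymous=True)
        try:
            msg = rospy.wait_for_message('/joint_states', JointState, timeout=2.0)
        except rospy.ROSException:
            self.fail('no JointState message received within 2 s')
        self.assertGreater(len(msg.name), 0, 'JointState message lists no joints')

if __name__ == '__main__':
    rostest.rosrun('my_robot_tests', 'test_joint_states', TestJointStatesPublished)
```

On top of this kind of test, the ATF adds benchmark runs and the monitoring of results over time, as described above.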
Looking forward to meeting you there.

http://www.erf2017.eu/wp-content/uploads/2017/01/ERF2017-Schedule-v1.1.pdf


ROS Integration Workshop


Participants of the workshop on ROS tools

At the beginning of February 2017, the first Factory-in-a-day robot software integration workshop took place. The goal was to bring together developers and system integrators from Factory-in-a-day and from other EU-funded projects working on ROS tools for scanning, testing, and deploying software, so that they could develop and test ROS tools that ensure software quality and a reliable, quick deployment. The slides of the following presentations are available:

  • The Automated Test Framework – introduction – a tool for runtime testing, i.e., checking quality “a posteriori” (presentation by Florian Weisshardt)
  • The deployment infrastructure – introduction – a tool for deploying code faster (presentation by Mathias Luedtke)
  • HAROS: static analysis of ROS code – a tool for code scanning, i.e., checking quality “a priori” (presentation by André Santos); a toy illustration of this kind of check is sketched below
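As a toy illustration of the “a priori” idea – emphatically not the HAROS implementation, just a stand-in written in plain Python – a static check only needs the source artefacts of a package, for example its package.xml manifest, and never has to launch any code:

```python
# Toy "a priori" check on a ROS package manifest -- illustrative only, not HAROS.
# It flags a missing maintainer/license tag and a package depending on itself,
# purely by inspecting the file, i.e. before anything is ever run.
import sys
import xml.etree.ElementTree as ET

def check_package_xml(path):
    root = ET.parse(path).getroot()
    problems = []
    for tag in ('maintainer', 'license'):
        if root.find(tag) is None:
            problems.append('missing <%s> tag' % tag)
    name = root.findtext('name', default='').strip()
    deps = [d.text.strip() for d in root.iter()
            if d.tag.endswith('depend') and d.text]
    if name and name in deps:
        problems.append('package %s declares a dependency on itself' % name)
    return problems

if __name__ == '__main__':
    for issue in check_package_xml(sys.argv[1]):
        print('WARNING: ' + issue)
```

Runtime testing with the ATF sits at the other end of the spectrum: it judges quality “a posteriori”, from the behaviour the code actually shows while it is being executed.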

Other participants were from the projects ROSIN and Scalable 4.0.

 


Integration Workshop on ROS-based tools


Participants of the first Integration Workshop.

On January 31 and February 1, 2017, Factory-in-a-day organized its first robot software integration workshop on ROS tools for scanning, testing, and deploying software. The workshop brought us together with other EU-funded projects using the same ROS tools, with the aim of exploiting the full potential of ROS.

Apart from our project, the projects ROSIN and Scalable 4.0 participated. Those two have only just started this year, whereas we are in our final year.

Thanks to Fraunhofer IPA for hosting the workshop.


Factory-in-a-day at ERF2017

The eighth edition of the European Robotics Forum will take place in Edinburgh, Scotland, UK, from 22 to 24 March 2017 (http://www.erf2017.eu). Within this context, the 4th Workshop on Hybrid Production Systems will take place on March 23rd. The EU-funded project Factory-in-a-day will give two presentations.

This workshop is dedicated to presenting the latest technologies and research results facilitating Human-Robot Collaboration (HRC) in an industrial setting, e.g. applications of augmented reality and wearables, safety, interaction, planning, simulation, tele-operation, etc.

More details also in the press release: http://www.factory-in-a-day.eu/media/press-material/

 

 


Online survey for Factory-in-a-day

picture: Thorben Wengert / pixelio.de


If you are working in an SME – or even own one – we would be very happy if you could spare a few minutes to answer our online questionnaire:

http://ww3.unipark.de/uc/robotics

Thank you very much in advance!

 


Prof. Wisse new Scientific Director of TU Delft Robotics Institute

Professor of Biorobotics Martijn Wisse, also the coordinator of Factory-in-a-day, is the new Scientific Director of the TU Delft Robotics Institute. Prof. Wisse was officially appointed during the Institute's yearly associates day on 24 January 2017.

Read more on the news section of their website: http://tudelftroboticsinstitute.nl/news/professor-martijn-wisse-leads-tu-delft-robotics-institute


New video on “Reactive Path Planning and Motion Control (Deliverable 4.4)”

This deliverable video focuses on dynamic obstacle avoidance, an essential component for ensuring safety in the robot environment and for getting robots to collaborate with their fellow human beings, thus improving the efficiency of processes in the factory environment – one of the goals of the project.
In terms of technology, the skin sensors from TUM, the reactive path planner from SIEMENS-PLM, and the reactive controller from LAAS make us confident of achieving this task. This deliverable combines all the components in a running manipulation scenario to illustrate the dynamic obstacle avoidance capability. The simulation video is a proof of concept for future deployment on the real robot. The illustration is done on the TOMM robot setup, with skin sensors on the right forearm of the robot. A minimal sketch of how a planner and a reactive controller can be combined is given below the list of excerpts.
The two excerpts in the deliverable video are as follows:
1. The reactive dynamic obstacle avoidance behaviour, using the simulated skin sensors with the ‘Stack of Tasks’ (SOT) reactive controller driving the robot.
2. The manipulation scenario, which shows the use of a PCL (Point Cloud Library) based planner together with the reactive SOT controller to avoid obstacles.
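To make the idea of combining a planner with a reactive controller concrete, here is a minimal sketch; it is our own illustration, not the Stack of Tasks or the SIEMENS-PLM planner, and the gains and distances are made-up example values. A planned joint-space step is blended with a repulsive correction derived from the closest obstacle point:

```python
# Illustrative only: blend a planned joint-space step with a Cartesian repulsive
# correction computed from the closest obstacle point, in the spirit of
# "planner + reactive controller". Not the SoT or SIEMENS-PLM code.
import numpy as np

def reactive_step(q, q_next_planned, jacobian, p_effector, p_obstacle,
                  d_influence=0.25, gain=0.5, dt=0.01):
    """Return the next joint configuration: follow the plan, but add a joint-space
    correction pushing the end effector away from a nearby obstacle."""
    dq_plan = q_next_planned - q
    diff = p_effector - p_obstacle
    d = np.linalg.norm(diff)
    if d >= d_influence or d == 0.0:
        return q + dq_plan                              # obstacle far: pure planning
    v_repulse = gain * (d_influence - d) * diff / d     # Cartesian push away
    dq_repulse = np.linalg.pinv(jacobian) @ v_repulse * dt
    return q + dq_plan + dq_repulse

# Example with a 2-DoF arm (3x2 Jacobian) and an obstacle ~9 cm from the effector.
q = np.zeros(2)
J = np.array([[0.0, -0.3], [0.5, 0.3], [0.0, 0.0]])
print(reactive_step(q, q + 0.01, J, np.array([0.4, 0.2, 0.1]),
                    np.array([0.45, 0.2, 0.02])))
```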


New video: Augmenting awareness by HoloLens


HoloLens used in the Factory-in-a-day project.

Augmenting awareness by HoloLens: initial exploration

Author/Institution: Jouke Verlinden/TU Delft

This video is a first exploration on how Augmented Reality can be used in production planning and during deployment & operation. It is part of Deliverable 4.4.

  1. HoloLens: Although we considered projector-based and non-visceral augmentation, a stereoscopic head-mounted display provides better visual (and auditory) feedback, specifically when considering a portable solution that can be used during the workflow of the Factory-in-a-day process. In particular, see-through options allow a proper overlap with the environment, and emerging devices such as the Microsoft HoloLens and the Meta 2 glasses enable these technologies in a proper, ergonomically fitting way. The scenario on motion paths and training gives an indication of how the system state can be overlaid on the physical environment.
  2. How do you plan to use feature 1 of HoloLens in Factory-in-a-day (scene reconstruction)?
    Scene reconstruction is essential for rendering occlusions, which heightens the effect of Presence (the notion of ‘being there’). For example, at 2:10 in the video, only part of the digital objects is rendered because a wall blocks the view. Another important aspect is that this enables a computational integration between the spatial context, robot systems, and workers, and can be used to reason about the situational awareness of individual workers (in terms of visibility, system status, etc.).
  3. How do you plan to use feature 2 of HoloLens in Factory-in-a-day (object placement)?
    Together with feature 1, this option shows how current gesture recognition is done with the HoloLens (so-called “Air tap” and gaze following). Furthermore, these features could be used during maintenance or configuration of robotic systems.
  4. What are the next steps? This is a first impression based on off-the-shelf software. Connectivity with ROS and assessing the usability of specific cues will be done during the coming 10 months.
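As a first impression of what the planned connectivity with ROS could look like, the hypothetical sketch below republishes the headset pose into ROS, where it can be combined with robot state for situational-awareness reasoning. It is not part of the deliverable; the topic name, frame name and the assumption that the headset exposes its pose as a position plus quaternion are ours.

```python
#!/usr/bin/env python
# Hypothetical sketch: republish a HoloLens head pose into ROS so that
# situational-awareness reasoning (visibility, proximity to robots) can be done
# on the ROS side. Topic, frame and the placeholder pose are made up.
import rospy
from geometry_msgs.msg import PoseStamped

def publish_head_pose(pub, position, orientation):
    """position: (x, y, z) in metres; orientation: quaternion (x, y, z, w),
    e.g. as obtained from the headset's own tracking."""
    msg = PoseStamped()
    msg.header.stamp = rospy.Time.now()
    msg.header.frame_id = 'factory_map'          # assumed shared world frame
    msg.pose.position.x, msg.pose.position.y, msg.pose.position.z = position
    (msg.pose.orientation.x, msg.pose.orientation.y,
     msg.pose.orientation.z, msg.pose.orientation.w) = orientation
    pub.publish(msg)

if __name__ == '__main__':
    rospy.init_node('hololens_pose_bridge')
    pub = rospy.Publisher('/hololens/head_pose', PoseStamped, queue_size=10)
    rate = rospy.Rate(30)                        # example headset tracking rate
    while not rospy.is_shutdown():
        publish_head_pose(pub, (0.0, 0.0, 1.7), (0.0, 0.0, 0.0, 1.0))  # placeholder
        rate.sleep()
```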


New article on Factory-in-a-day in Control Engineering magazine

Factory-in-a-day is mentioned in an article in the Control Engineering magazine:

“Easier, safer robotic programming is among results from an expanding open-source robotics software group”

 


How Factory-in-a-day contributed to URCaps

Nowadays, robotic assembly solutions are becoming easier to put together thanks to standardized hardware interfaces and specific “designed for” add-ons. On a software level, however, the configuration and operation of these extensions are becoming more complex, due to the greater number of configuration options and because extensions rely more and more on software for their functionality (e.g. vision systems).

The URCaps (UR Capabilities) are hardware and/or software extensions for the Universal Robot system. The purpose of the URCaps is to seamlessly extend any Universal Robot with customized functionality. Using the URCap software platform, third parties can define graphical user interfaces that seamlessly integrate with the UR workflow and provide device drivers for their hardware.

The research done in the Factory-in-a-day project has contributed to the following features of the URCaps software platform:


Figure 1: Custom installation node

  • Workflow integration – Third party developers can provide custom installation extensions (see Figure 1) and custom program nodes (see Figure 2). The installation stores information and provides an interface for a specific hardware setup, i.e. settings that are valid for any program made with this hardware configuration. Custom program nodes can be used to hide complicated behavior and provide a convenient graphical user interface for the end-customer.
  • Device drivers – Many hardware extensions require device drivers for the robot program to communicate with the hardware extension or an extension might require a daemon process to run on the robot. The URCaps software platform provides a generic way to install and run a daemon process.
  • Real-Time Data Exchange (RTDE) – Reliably exchange data between the UR robot controller and a third-party process to implement hierarchical control loops or monitoring software. Request specific robot state data (incl. registers) to be output at a specified rate. Input custom data (e.g. setpoints) through registers and use it in your program. The streaming setup is made on a per-connection basis, and watchdogs are available to guard the input connection status.


    Figure 2: Custom program node

  • XML-RPC – XML-encoded Remote Procedure Calls for URScript. Calling an RPC looks almost like calling a normal URScript function, e.g. camera.get_next_object_coordinates(“bolt”, “M8”). However, the RPC is executed by a different software stack than the UR controller, which might run on a different computer and might have different software installed (e.g. image processing libraries). Any type used by URScript can be transmitted as an argument or received as a return value. Due to their simple usage and flexibility, RPCs are great for configuration and service calls, e.g. to set up a device, perform a computationally intensive calculation, or combine URScript with other software packages (a minimal server sketch follows below).
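For illustration, the external software stack behind such a call can be a plain XML-RPC server. The sketch below is our own example, not UR sample code; the function name mirrors the hypothetical camera call above, and the returned pose values are placeholders.

```python
# Minimal sketch of the external side of an URScript XML-RPC call -- our own
# illustration, not UR sample code. It serves the hypothetical
# get_next_object_coordinates() call from the text; the pose is a placeholder.
from xmlrpc.server import SimpleXMLRPCServer   # Python 2: the SimpleXMLRPCServer module

def get_next_object_coordinates(object_type, size):
    """Pretend vision result: pose of the next detected object as
    [x, y, z, rx, ry, rz] in the robot base frame."""
    print('request for %s %s' % (object_type, size))
    return [0.30, -0.10, 0.05, 0.0, 3.14, 0.0]

server = SimpleXMLRPCServer(('0.0.0.0', 33000), allow_none=True)
server.register_function(get_next_object_coordinates)
print('camera XML-RPC server listening on port 33000')
server.serve_forever()
```

On the robot side, URScript then creates an XML-RPC proxy pointing at this server's URL and calls get_next_object_coordinates() as if it were a local function; see UR's XML-RPC documentation for the exact URScript syntax.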

 

The main benefits for third-party developers are:

  • Easy integration & standardization – Seamless integration with the “normal” PolyScope workflow. No more “hacking” around to integrate software.
  • Better support possibilities – The URCaps software platform provides, among other things, explicit interfaces and versioning. Configuration mistakes can be detected at programming time, and simple coding errors are eliminated.
  • Sales / Marketing platform – Well-made URCaps can be featured on the Universal Robots+ website. Distributors and integrators can present to each other accessories that already run successfully at end users.
  • New opportunities – New opportunities for integrating products and services, and tapping into new information sources.

The main benefits for end-customers / integrators are:

  • Lower installation time – Easy-to-use graphical user interfaces lower the technical expertise required to install hardware. Furthermore, less time is needed for installation and setup, since the graphical interface can guide one through the process and provide real-time feedback.
  • Faster programming – By lowering the threshold for configuring and programming robots, URCaps create a potential for highly reconfigurable solutions in the production environment. The robot becomes a general purpose tool rather than a highly specialized setup.
  • Lower project risk – URCaps featured on the Universal Robots+ website have run successfully at end users and are dedicated to UR robots.
  • Shorter lead time to implement robot applications – Access to well-proven technology will help to realize new automation solutions quicker.

The following two videos were made in cooperation with On Robot ApS and Robotiq. The videos show how lower installation time and faster programming for their end-customers have been achieved with the URCaps software platform. At the end of each video, the benefits for the end-customer and for URCaps developers are listed separately.
