This project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 825196
Jyrki Latokartano
Tampere University
Indumation, Wed 6th Feb 2019, Kortrijk, Belgium
TRINITY: The State of the Art of Industrial Robots: applications, capabilities and challenges in 2019 and beyond. Introducing TRINITY, the European support hub for applying robots in small and medium-sized manufacturing companies.
Jyrki Latokartano
Tampere University
Indumation expert classes, Wednesday 6th Feb. 2019, Kortrijk, Belgium
DT-ICT-02-2018 Robotics - Digital Innovation Hubs (DIH)
H2020 Innovation Action (IA)
Distribution of Manufacturing in EU
2030
Order → Delivery: value-added time 1-5%
• Increase of competitiveness over the supply network
• Reduce the non-value added time
• Increase the quality and production capacity by robotics, and supply network transparency and reliable real-time data visibility by ICT and AI
Competitiveness
• Find&Use the right analytical methods in right time and for the right purpose
• Improve the input data (collection, filtering, harmonization)
• Creation of digital information flow (concrete examples) to increase transparency and traceability
• Ensure safety and trust
Digitalization
• Support the forming of temporal ecosystems among SMEs and Large companies
• Find the experts and share competences more efficiently in future
• Increase the trust among companies to form alliances
Ecosystems
• Mitigate the problem with shortage of engineers by 2030
• The skills gap between high-skilled and low-skilled persons is widening
Skills gap
TRINITY DIH: Digital Technologies, Advanced Robotics and increased Cyber-security for Agile Production in Future European Manufacturing Ecosystems
WHY DIHs?
Ref: ManuFuture 2030
H2020-DT-2018-2020 DIGITISING AND TRANSFORMING EUROPEAN INDUSTRY
AND SERVICES: DIGITAL INNOVATION HUBS AND PLATFORMS
• TRINITY - Digital Technologies, Advanced Robotics and increased Cyber-security for Agile Production in Future European Manufacturing Ecosystems
• DIH² - A Pan‐European Network of Robotics DIHs for Agile Production
• QU4LITY - Digital Reality in Zero Defect Manufacturing
• DIHNET.EU - Next Generation European DIH Network
• RIMA - Robotics for Infrastructure Inspection and Maintenance
• DIH-HERO - Digital Innovation Hubs in Healthcare Robotics
• RODIN - RObotics Digital Innovation Network
https://www.eu-robotics.net/sparc/upload/Newsroom/Press/2017/DIH_in_Agile_Production_FINAL.pdf
TRINITY: Agility for production in Europe
Technical pillars
Standards
TRINITY consortium
• Coordinator: Tampere University of Technology (TUT), Finland
• Research partners: Centria (Finland), UiT, (Norway), JSI (Slovenia), LMS (Greece), BME (Hungary), FhG (Germany), Flanders Make (Belgium), EDI (Latvia)
• Companies: Fastems (Finland), LP Montagetechnik (Germany), F6S (Ireland)
• Associations & Clusters: LSEC (Belgium), Civitta (Lithuania), CECIMO (Belgium), DNT (Norway)
Concept and Approach
• TRINITY core partners will prepare 17 modular and re-configurable use-case demonstrations in the field of robotics. Each of the use-case demonstrations includes well-defined specifications, “how to set up” tutorials and “how to use” education packages.
• These extensive modular use cases will demonstrate the contribution of robotics, artificial intelligence and cybersecurity to improving agility and performance in manufacturing activities and to generating new product and service concepts in the field of robotics.
TRINITY Main components: Lead with example – learn by doing
Target: 50 modular use case solution blocks by the end of 2022
• Internal demonstrations
  • Minimum of 16
  • Presented and documented 2019-2020
• Company demonstrators
  • Over 30 developed 2020-2022
  • First open call in 2020
  • All TRINITY pillars covered
Standards
Use case demonstrations 1-4/18
• Use case 1 Engine block assembly with enhanced vision-based safety system (TUT) Goal: To increase utilization of safe robotics in human-robot collaboration. The proof of concept utilizes industrial case product with larger robot systems.
• Use case 2 Human position recognition and feedback with AR technologies in disassembly process (TUT, LMS) Goal: To increase utilization of safe robotics in human-robot collaboration. The proof of concept utilizes industrial case product with larger robot systems.
• Use case 3 Collaborative robotics in large scale assembly (CENT) Goal: To demonstrate how the robotized assembly of big construction elements can be made more agile with a fenceless robot application, using e.g. Kinect-based tracking of people, AR glasses and voice recognition as the UI.
• Use case 4 Integrating digital context (e.g. BIM) and AR to a digital twin in a robotized factory (CENT) Goal: Demonstrating how robotized factories, which e.g. make construction elements for blocks of flats, can make their production more agile by integrating BIM and a digital twin with AR.
Use case demonstrations 5-8/18
• Use case 5 Wire arc additive manufacturing with industrial robots (UiT, LSEC) Goal: Increase production rate with additive manufacturing of metal parts
• Use case 6 Production flow simulation/supervision (UiT, LSEC) Goal: Visualization of production, along with distant monitoring/control of production flow
• Use case 7 Robot workcell reconfiguration (JSI) Goal: Increase the flexibility of production by providing the necessary hardware and software toolchain to quickly reconfigure robot workcells for the assembly of new products
• Use case 8 Quick programming and calibration of assembly tasks by kinesthetic teaching (JSI) Goal: The aim is to quickly program and calibrate a robot workcell for a different assembly task. Both task programming and calibration will rely on kinesthetic teaching, which will enable the user to program a new assembly program within a few hours.
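Kinesthetic teaching boils down to recording the robot's joint configurations while a person guides the gravity-compensated arm, then turning that recording into a motion program. A minimal, hardware-free sketch (class and field names are illustrative, not JSI's actual toolchain):

```python
import time

class KinestheticRecorder:
    """Minimal sketch: record joint poses while a gravity-compensated
    robot is guided by hand, then turn them into a motion program."""

    def __init__(self):
        self.waypoints = []  # list of (timestamp, joint_angles)

    def record(self, joint_angles):
        # In a real cell this would be fed from the robot's joint-state stream.
        self.waypoints.append((time.time(), tuple(joint_angles)))

    def to_program(self):
        # Turn the raw recording into an ordered motion program,
        # dropping consecutive duplicate poses.
        program = []
        for _, q in self.waypoints:
            if not program or program[-1] != q:
                program.append(q)
        return program

rec = KinestheticRecorder()
for q in [(0.0, 0.1), (0.0, 0.1), (0.2, 0.3), (0.4, 0.5)]:
    rec.record(q)
print(rec.to_program())  # three unique waypoints
```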
Use case demonstrations 9-11/18
• Use case 9 HRC dynamic task planning & work re-organization (LMS) Goal: Increase the production volume of such a cell by reducing cycle time. Multi-criteria evaluation and an easily customizable workplace based on end-user requirements. Evaluate ergonomics issues for human operators’ tasks.
• Use case 10 HRI framework for operator support applications in human-robot collaborative operations (LMS) Goal: Increase operators’ safety awareness and enable direct interaction between robot resources and operators by using different technologies. Include the human operator in the execution loop by giving him/her the ability to provide feedback to a central execution controller.
• Use case 11 Robotized serving of automated warehouse (BME) Goal: The goal of this experiment is to demonstrate the feasibility of using mobile robots in intralogistics, transporting anything from raw material to final product between production machines and warehouse or packaging area.
Use case demonstrations 12-14/18
• Use Case 12 Reconfigurable human-robot collaborative work cell for assembly of product variants (MAKE) Goal: Aim is to demonstrate how production systems can be easily reconfigured based on product specifications for assembly of product variants and how adequate work instructions are provided to operators.
• Use case 13 Deployment of mobile robots in collaborative work cell for assembly of product variants (MAKE) Goal: Deployment of mobile collaborative manipulators in shared work places to perform assembly operations.
• Use case 14 Adaptive Manufacturing System (AMS) (FASTEMS) Goal: Development of a new multi machine control system for adaptive and reconfigurable production focusing on integration of robot cells into control systems
Use case demonstrations 15-18/18
• Use case 15 Simulation and evaluation of optimal node distribution, reliability and attack robustness in wireless networks in production environments (FhG) Goal: Increase the robustness of wireless networks in production environments, put wireless networks into service faster, and improve the simulation models.
• Use case 16 Validation of flexible jigs for agile production (FhG) Goal: Plan, design and test flexible devices for fixing, grasping and assembling in different use cases. This is a basic requirement for (partial) automation in SMEs
• Use case 17 Artificial Intelligence based stereo vision system for object detection, recognition, classification and pick-up by a robotic arm (EDI) Goal: Automation of industrial processes involving a large number of objects with unpredictable positions/locations.
• Use case 18: Rapid development, testing and validation of large wireless sensor networks for production environments (EDI) Goal: To decrease time to market
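The attack-robustness idea in use case 15 can be illustrated with a tiny graph experiment: remove the best-connected node and check whether the network stays connected. A stdlib-only sketch (not FhG's actual simulation models):

```python
def connected(nodes, edges):
    """Check connectivity of an undirected graph with a simple traversal."""
    if not nodes:
        return True
    adj = {n: set() for n in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, stack = set(), [next(iter(nodes))]
    while stack:
        n = stack.pop()
        if n in seen:
            continue
        seen.add(n)
        stack.extend(adj[n] - seen)
    return seen == set(nodes)

def attack_robust(nodes, edges):
    """Does the network stay connected after losing its best-connected node?"""
    degree = {n: 0 for n in nodes}
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    hub = max(nodes, key=degree.get)
    rest = [n for n in nodes if n != hub]
    kept = [(a, b) for a, b in edges if hub not in (a, b)]
    return connected(rest, kept)

# A ring survives losing any single node; a star does not.
ring = [(i, (i + 1) % 5) for i in range(5)]
star = [(0, i) for i in range(1, 5)]
print(attack_robust(list(range(5)), ring))  # True
print(attack_robust(list(range(5)), star))  # False
```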
Problem/goal Utilization of safe and intuitive robotics in human-robot collaboration.
Potential users SMEs for novel safety systems and co-bot potential in assembly.
NACE 29.3 Manufacture of parts and accessories for motor vehicles
Description Demonstration of a vision-based safety system for human-robot collaborative assembly of diesel engine components. A dynamic 3D map of the working environment (robot, components and human) is continuously updated and used for safety and interaction (virtual GUI). The robot working zone is projected onto a flat surface.
Hardware Universal Robots (UR5), Robotiq gripper, Kinect, Intel Realsense, projector
Software Open source software (ROS, MoveIt)
Standards Considered: ISO/TS 15066:2016, ISO 10218-1/2
Possible benefits Studies with collaborative robots, human-robot interaction, pose recognition and handling of complex objects (engine block components), dynamic 3D safety zone in shared workspace
Partners Tampere University (Finland), LMS (Greece), EDI (Latvia)
More info https://www.dropbox.com/s/xlatmas4w6r2rx7/user_studies_grid.mp4?dl=0
Use case 1: Collaborative assembly with vision-based safety system
Projection-based safety zone around robot
Diesel engine components for assembly
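The safety behaviour described above follows the speed-and-separation-monitoring pattern of ISO/TS 15066: the closer a person gets to the robot, the slower the robot moves, down to a full stop. A minimal sketch with illustrative thresholds (not the demonstrator's actual parameters):

```python
import math

def min_distance(human_points, robot_points):
    """Smallest Euclidean distance between any human and robot point (metres)."""
    return min(
        math.dist(h, r)
        for h in human_points
        for r in robot_points
    )

def speed_scale(distance, stop_dist=0.3, full_speed_dist=1.0):
    """Speed-and-separation monitoring in the spirit of ISO/TS 15066:
    stop inside the protective distance, run at full speed beyond the
    monitored zone, scale linearly in between. Thresholds are illustrative."""
    if distance <= stop_dist:
        return 0.0
    if distance >= full_speed_dist:
        return 1.0
    return (distance - stop_dist) / (full_speed_dist - stop_dist)

# Tracked points from e.g. a Kinect skeleton and the robot's link positions.
human = [(1.2, 0.0, 1.0)]
robot = [(0.0, 0.0, 1.0), (0.5, 0.0, 1.0)]
d = min_distance(human, robot)  # 0.7 m
print(speed_scale(d))
```

In the demonstrator the same distance field also drives the projected zone boundaries, so the operator sees where the slowdown region begins.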
Use case 2: Collaborative disassembly with augmented reality interaction
3D Diesel engine model for disassembly
MS HoloLens for augmented reality interaction
Problem/goal Utilization of human-robot collaboration with larger robots
Potential users SMEs for augmented reality interaction and industrial disassembly
NACE 33.1 Repair of fabricated metal products, machinery and equipment
Description Disassembly of an industrial product. The vision system scans the product and recognizes its type, position and orientation. The cell control system makes a task allocation between the robot and the operator. The operator can see the disassembly instructions and the robot safety zones in 3D with an MS HoloLens AR headset. The operator notifies the robot via gestures. The sensor system supervises the workspace.
Hardware ABB IRB4600, Kinect, Intel Realsense, DLP projector, MS HoloLens
Software Open source software (ROS, MoveIt)
Standards Considered: ISO/TS 15066:2016
Possible benefits Applications with large robots for disassembly and AR interaction. Object and pose recognition of complex objects (engine block components)
Partners Tampere University (Finland), LMS (Greece), EDI (Latvia)
More info
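The cell controller's task allocation between robot and operator can be sketched as a simple capability-based split; the data model below is illustrative, not the TRINITY controller's actual API:

```python
def allocate_tasks(tasks):
    """Toy task-allocation sketch: disassembly steps the robot can perform
    go to the robot, the rest go to the operator. A real cell controller
    would also weigh cycle time, reach and ergonomics."""
    plan = {"robot": [], "operator": []}
    for task in tasks:
        agent = "robot" if task["robot_capable"] else "operator"
        plan[agent].append(task["name"])
    return plan

# Hypothetical disassembly steps for an engine component.
tasks = [
    {"name": "unscrew bolts", "robot_capable": True},
    {"name": "inspect gasket", "robot_capable": False},
    {"name": "lift cover", "robot_capable": True},
]
print(allocate_tasks(tasks))
```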
Problem/goal Utilization of agile human robot collaboration in large scale assembly tasks, such as assembly of a pre-fabricated wall element
Potential users Integrators of robotic applications and companies carrying out large-scale prefabrication or building component manufacturing
NACE 33.20 Installation of industrial machinery and equipment
Description Demonstration of agile industrial robotization of a large-scale prefabricated wall element assembly where robots and people process elements simultaneously. The working zone is monitored dynamically and provided to the worker and the robot together with the task plans and situation-aware information. In the use case, different communication methods (RF tracking, voice recognition, together with AR and mobile user equipment) are evaluated.
Hardware ABB/KUKA robots, Universal Robots (UR3/10), Robotiq gripper, Pilz Safety Eye, 3D Kinect, RF tracking and local positioning systems, LIDARs, Sick S300 safety scanner
Software Commercial (Visual Components/ ABB Robot Studio/ RoboDK), and open source software (ROS)
Standards Considered: ISO/TS 15066:2016, ISO 10218-1/2
Possible benefits Studies with collaborative robots, human-robot interaction, dynamic 3D safety
Partners Centria, Tampere University (Finland), FhG (Germany), UiT (Norway), LMS (Greece)
More info
Use case 3: Collaborative robotics in large scale assembly
Agile large-scale prefabrication can benefit from collaborative robotics
BIM, digital twins and AR/VR (e.g. MS HoloLens) can be utilized in agile production
Problem/goal Utilization of digital context and digital twins for the robotized production with AR/VR
Potential users Integrators of industrial robotic applications, manufacturing companies and SMEs providing or utilizing augmented reality interaction
NACE 28.29 Manufacture of other general-purpose machinery
Description Demonstrate how companies carrying out prefabrication can utilize robotized manufacturing to make their production more agile by integrating BIM, digital twins and VR/AR technology. They can utilize these agile concepts for more flexible monitoring, operational support, training, safety and maintenance of the production cell.
Hardware ABB/KUKA/UR robots, MS HoloLens, HTC Vive, 3D Kinect, LIDARs, NDI Optotrack, Leica long range scanner, SICK encoders
Software Commercial (Dassault 3DExperience, Visual Components/ ABB Robot Studio/ RoboDK)and open source software (Unity, Vuforia, Blender, ROS, Linux)
Standards Considered: ISO/TS 15066:2016
Possible benefits Studies with digital twins, BIM and AR/VR technology for collaborative robotics in industrial environments for better human-robot interaction, and dynamic 3D safety
Partners Centria, Tampere University (Finland), FhG (Germany), UiT (Norway)
More info
Use case 4: Integrating digital context (e.g. BIM) to the digital twin with AR/VR of the robotized production
Use case 7: Robot workcell reconfiguration
Problem/goal Quick robot-aided reconfiguration of different assembly processes
Potential users SMEs doing high mix assembly tasks
NACE 29.3 Manufacture of parts and accessories for motor vehicles
Description The aim is to provide economical personalised manufacturing with a new kind of autonomous robot workcell, which will be attractive for changeable few-of-a-kind production. It will be based on innovative hardware reconfiguration technologies, ROS-based software solutions, and a design and reconfiguration simulation system.
Hardware UR5, novel reconfigurable hardware
Software ROS-based software environment, simulation tools
Standards Considered: ISO 10218-2:2011
Possible benefits The main feature of the proposed robot workcell will be that it can be partly automatically reconfigured to execute new production tasks efficiently and economically with a minimum amount of human intervention.
Partners Jožef Stefan Institute, Ljubljana, Slovenia
More info
Use case 7: Robot workcell reconfiguration
Jožef Stefan Institute, Ljubljana, Slovenia
Use case 9: Dynamic task planning & work re-organization
Problem/goal Support production designers during the manufacturing system design process
Potential users SMEs that need novel solutions for optimizing their production while automating the design process.
NACE 29.3 Manufacture of parts and accessories for motor vehicles
Description Demonstration of an intelligent decision-making framework for active and passive resource allocation in a workcell, rough motion planning of human and robot operations and initial task planning. Multi-criteria decision-making modules integrating 3D graphical representation, simulation and embedded motion planning are used to validate alternative workplace layouts and task plans.
Hardware High performance computer
Software Open source software (ROS), Siemens - Process Simulate
Standards Considered: ISO/TS 15066:2016, ISO 10218-1/2
Possible benefits Minimize the time and effort required for multiple iterations between the designers, process engineers and system integrators. The solution gathers all this knowledge into one tool and provides feedback to the human within a short time frame (minutes instead of a month's work).
Partners LMS – University of Patras, Greece
More info https://www.youtube.com/watch?v=0asQ5HYwe2g
Intelligent heuristics integrated with 3D simulation tools
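The multi-criteria evaluation of alternative workplace layouts can be sketched as a weighted-sum ranking; the criteria names and weights below are illustrative, not LMS's actual evaluation model:

```python
def rank_layouts(layouts, weights):
    """Weighted-sum multi-criteria ranking of alternative workcell layouts.
    All scores here are 'higher is better'; criteria where lower is better
    (e.g. cycle time) would be inverted before scoring."""
    def score(layout):
        return sum(weights[c] * layout["scores"][c] for c in weights)
    return sorted(layouts, key=score, reverse=True)

# Hypothetical normalized scores for two candidate layouts.
weights = {"throughput": 0.5, "ergonomics": 0.3, "floor_space": 0.2}
layouts = [
    {"name": "A", "scores": {"throughput": 0.9, "ergonomics": 0.4, "floor_space": 0.6}},
    {"name": "B", "scores": {"throughput": 0.7, "ergonomics": 0.9, "floor_space": 0.8}},
]
best = rank_layouts(layouts, weights)[0]["name"]
print(best)  # B (0.78 vs 0.69)
```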
Use case 10: HRI framework for operator support in human-robot collaborative operations
Problem/goal Support and increase human operator’s safety feeling during collaborative applications
Potential users SMEs interested in exploiting the synergy of humans and robots in assembly
NACE 29.3 Manufacture of parts and accessories for motor vehicles
Description Demonstration of an Augmented Reality (AR) based application providing to the human operators: a) assembly instructions, b) robot behaviour information for increasing safety awareness, c) safe working volumes, d) production status information. Interfaces on smart wearable devices enable easy and direct human-robot interaction while the HRC execution is orchestrated and monitored through a service-based controller.
Hardware Industrial robots, Augmented Reality glasses, Smartwatch
Software Open source software (ROS, RosBridge, ROS Java, Unity, Vuforia)
Standards Considered: ISO/TS 15066:2016, ISO 10218-1/2
Possible benefits Inexperienced operators can be allocated to work in HRC cells and new processes with limited training requirements, providing agility in re-allocating human resources according to production needs
Partners LMS – University of Patras, Greece
More info https://www.youtube.com/watch?v=FsYA26SowVk
HRC assembly cell
AR based application – Operator’s field of view
Use Case 16: Flexible automation for agile production
Problem/goal Plan, design and test flexible devices for fixing, grasping and assembling
Potential users Integrators of wireless networks in IIoT environments, IIoT manufacturers, researchers
NACE C26.1 Manufacture of electronic components and boards
Description Highly flexible solutions for handling and clamping parts during the assembly process are needed to realize small lot sizes with high variety. Flexible grippers and jigs are a possible solution. Requirements of different product types must be considered while planning, designing and constructing such systems. The main idea is to develop methods for planning and designing such tools and jigs. The use case is demonstrated with LED-lamp production.
Hardware Industrial robot arm, vision system (hardware), gripper
Software Vision system (software)
Standards C# (ISO/IEC 23270:2006); Computer graphics and image processing - The Virtual Reality Modeling Language (ISO/IEC 14772-1:1997; ISO/IEC 14772-2:2004)
Possible benefits A method to identify and rate the automation potential of different workplaces; a solution for creating a highly flexible production system for products in small lot sizes and high variety; a summary of design rules for manual workplace design
Partners LP-Montagetechnik (Germany)
More info
Problem/goal Automation of industrial processes involving a large number of objects with unpredictable positions.
Potential users SMEs willing to optimize their production process by using AI-based robotic arms.
NACE C32 - Other manufacturing
Description Many industrial processes involve operations with a large number of different objects. It is hard to automate these kinds of processes because it is sometimes impossible to predetermine the positions of the objects. To overcome this issue, we integrate 3D and 2D computer vision solutions with AI and robotic systems for object detection, localization and classification.
Hardware RealSense, Microsoft Kinect V2, Bumblebee, Proximity sensor, Universal Robots UR5
Software Open source software (ROS, TensorFlow)
Standards Considered: Python, OpenCV
Possible benefits The provided AI-based algorithms and methods will allow labelled data for various objects to be generated much faster, with a reduced amount of manual work, enabling faster adaptation of a system capable of detecting, recognizing, classifying and picking up randomly dropped objects with a robotic arm in different scenarios.
Partners EDI (Latvia)
More info https://www.youtube.com/watch?v=aovhtCX4aiM&t
Use case 17: Artificial Intelligence based stereo vision system for object detection, recognition, classification and pick-up by a robotic arm
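A stereo vision system localizes objects by converting pixel disparity into metric depth with the standard pinhole relation Z = f·B/d. A minimal sketch with illustrative calibration values (not EDI's actual camera parameters):

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth from a calibrated stereo pair: Z = f * B / d.
    focal_px: focal length in pixels; baseline_m: camera separation in
    metres; disparity_px: pixel disparity of the matched feature."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# E.g. a 700 px focal length, 12 cm baseline and 60 px disparity
# put the matched object 1.4 m from the cameras.
print(stereo_depth(700, 0.12, 60))  # 1.4
```

The recovered depth, combined with the detection's pixel coordinates, gives the 3D pick point that is handed to the robot arm's motion planner.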
What’s in it for me?
Pick the best parts and try it yourself
• Demonstrations available to try for all interested companies
• Implement the blocks that suit you best
• Build your own blocks
• Funding available through open calls
TRINITY digital access point
• Access to professional networks
• Internal demonstrations
  • Technical descriptions
• Implementation instructions
• Training material
• Training and consultation
  • Technical
• Financial
• Business planning
• Funding for companies
  • Open calls 2020 and 2021
www.trinityrobotics.eu
More info:
www.trinityrobotics.eu
@eu_trinity
bit.ly/TRINITYlinkedin
TRINITY opening event
Thank you!
Jyrki Latokartano, Project Manager, TRINITY DIH
Tampere University
Faculty of Engineering and Natural Sciences
Mobile: +358-40-706-0122
Email: [email protected]
Web: http://www.tuni.fi