`
`
`Authors: Nancy C. Porter (V), J. Allan Cote (V), Timothy D. Gifford (V), and Wim Lam (V)
`
`
`ABSTRACT
`
`
`Based on the orientation and travel speed of a welding torch, virtual reality technology simulates gas metal arc
`welding in near-real time using a neural network.
`
`
`
`
KEY WORDS: Virtual Reality; Welder Training; Gas Metal Arc Welding; Arc Welding; Virtual Reality Welding; Neural Network; Welding.
`
`INTRODUCTION
`
`Welding is the key technology for fabrication and assembly of metal
`structures. In some industries, notably automotive, welding is done
`primarily by robots or automated machinery. In many industries such
`as shipbuilding, heavy equipment production, and small parts
fabrication, welding is a value-added operation that remains a largely
manual process performed by human welders. Training of new welders is a
`significant activity both for industry and for the vocational education
`community. Training is especially important for welders working on
`critical items such as pressure vessels, nuclear piping, and naval ships,
`where welds have high quality requirements and are carefully
`inspected. It is estimated that the combined annual welder training
costs for all U.S. shipyards are in excess of $5M. There is active
`interest among naval shipbuilders to reduce welder training costs
`(Boutwell 2002).
`
Virtual reality (VR) is currently used as a training tool in a number of
different application areas, including medicine, aviation, law
enforcement, and the military (Waterworth 1998). For example, VR
simulations with mixed reality haptic displays are routinely used for
training surgeons in laparoscopic techniques, and commercial pilots
train almost exclusively on flight simulator platforms. Logically, VR
technology should improve the effectiveness of welder training.
`
`Toward this end, a VR welding trainer was developed for General
`Dynamics Electric Boat (Porter July 2004) with funding provided by
`the U.S. Navy’s Manufacturing Technology (ManTech) Program
`under contract to the Navy Joining Center of Excellence operated by
`Edison Welding Institute (EWI). This paper is a summary of the
`work performed to date.
`
`SUMMARY
`
A prototype mixed reality system was created to allow a human to
make a virtual gas metal arc fillet weld in the horizontal welding
position. The system records process parameters that are displayed
after welding for critique and instruction. This represents a
first-of-its-kind welder training approach, which leverages current
state-of-the-art virtual reality (VR) technology that is capable of
integration into the product simulation/product-centered manufacturing
approaches currently being developed and applied by General Dynamics
Electric Boat (GDEB).
`
`The technical team consists of EWI, GDEB, and VRSim. EWI is
`North America’s largest engineering and technology organization
`dedicated to welding and materials joining. EWI’s staff provides
`materials joining assistance, contract research, consulting services,
`and training to over 3,300 member company locations representing
`world-class leaders in aerospace, automotive, defense, energy, heavy
`manufacturing, medical, and the electronics industries. GDEB has
`established standards of excellence in the design, construction and
lifecycle support of submarines for the U.S. Navy. VRSim is a leader
in the design, development, and implementation of interactive virtual
reality simulations, content development, systems integration, and
visualization applications.
`
`The first year of project work produced an innovative VR welder
`training system keyed to shipbuilding that is easily adaptable to
`related defense manufacturing applications. The second year of work
`is aimed at enhancing simulation fidelity.
`
`SIMULATION DEFINITION
`
`The foundation of the technical approach was the identification of the
`welding process to simulate. The project team selected the gas metal
`arc welding (GMAW) process. Fig. 1 is a schematic of GMAW
`showing the distinguishing characteristics of the process.
`
`
`
`Fig. 1 - GMAW Process Schematic
`
`
`
`
Fig. 3 - Work Angle (figure labels: work plane, work angle)

Fig. 4 - Travel Angle (figure labels: push angle, drag angle, travel plane, axis of weld, travel direction)
`
`Using the GDEB welding parameters, a design of experiments (DOE)
`approach was applied to create the welding parameter matrix in
`Table A1, which contains forty-three unique parameter combinations.
`These welding parameter combinations produced a number of
`different bead profile characteristics. In the horizontal welding
`position, sample welds were made for each of the forty-three weld
`parameter combinations. The samples were then cut in cross section
`and measured to record the physical characteristics of each weld bead.
`The features of interest were: weld penetration, bead size, bead shape,
undercut, porosity, etc. For each weld sample, dimensions were
converted from pixels to millimeters using the length-to-pixel ratio of
the cross-section photo (as shown in Fig. 5). In addition, for each
sample weld, width dimensions were also measured at standard heights as
shown in Fig. 6.
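
As an illustration of this measurement step, the short sketch below converts pixel distances from a cross-section photo to millimeters using a length-to-pixel ratio; the calibration length and the measured pixel counts are hypothetical values, not data from the project.

```cpp
// Minimal sketch (not the project's actual measurement tool): converting
// pixel distances from a weld cross-section photo to millimeters using a
// length-to-pixel ratio established from a feature of known size.
#include <cstdio>

int main() {
    // Hypothetical calibration: a 10.0 mm gauge block spans 412 pixels in the photo.
    const double known_length_mm = 10.0;
    const double known_length_px = 412.0;
    const double mm_per_pixel = known_length_mm / known_length_px;

    // Hypothetical measured features (in pixels) from one cross section.
    const double leg_length_px = 275.0;
    const double penetration_px = 58.0;

    std::printf("leg length  = %.2f mm\n", leg_length_px * mm_per_pixel);
    std::printf("penetration = %.2f mm\n", penetration_px * mm_per_pixel);
    return 0;
}
```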
`
`Video and audio data were also captured from VHS recordings of the
`GMAW process. The audio recordings were used to create a digital
`sound track for the simulation that includes both random variations
`and contact tip-to-work distance responses.
`
All physical characteristic data acquired from the sample welds were
used to train a neural network for use in the simulation. The neural
network software then produced a predictive method for weld bead
shape based on the welding parameter inputs and the eight standard width
measurements (Fig. 7). The neural network then output this
predictive algorithm as C code, which was used for the requisite
simulations. With this method, the weld bead profile is predicted
from real-time inputs from the virtual welding system.
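
To make the idea concrete, the sketch below shows the general shape of a small feed-forward network evaluation, similar in spirit to the C code such tools can export; the input set, layer sizes, and weights are illustrative assumptions rather than the network actually trained for the simulator.

```cpp
// Illustrative sketch only: placeholder layer sizes, inputs, and (untrained) weights.
#include <array>
#include <cmath>
#include <cstdio>

constexpr int kInputs  = 5;  // e.g., wire feed speed, voltage, travel speed, work angle, travel angle
constexpr int kHidden  = 8;
constexpr int kOutputs = 7;  // predicted widths W1..W7 at the standard heights

using Input  = std::array<double, kInputs>;
using Output = std::array<double, kOutputs>;

double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

Output predictBeadWidths(const Input& x,
                         const double wIH[kHidden][kInputs], const double bH[kHidden],
                         const double wHO[kOutputs][kHidden], const double bO[kOutputs]) {
    std::array<double, kHidden> h{};
    for (int j = 0; j < kHidden; ++j) {               // hidden layer
        double s = bH[j];
        for (int i = 0; i < kInputs; ++i) s += wIH[j][i] * x[i];
        h[j] = sigmoid(s);
    }
    Output y{};
    for (int k = 0; k < kOutputs; ++k) {              // linear output layer (widths in mm)
        double s = bO[k];
        for (int j = 0; j < kHidden; ++j) s += wHO[k][j] * h[j];
        y[k] = s;
    }
    return y;
}

int main() {
    static const double wIH[kHidden][kInputs] = {};   // placeholder (untrained) weights
    static const double bH[kHidden] = {}, bO[kOutputs] = {};
    static const double wHO[kOutputs][kHidden] = {};
    const Input torchState{300.0, 26.0, 15.0, 45.0, 90.0};  // hypothetical parameter values
    const Output widths = predictBeadWidths(torchState, wIH, bH, wHO, bO);
    std::printf("predicted W1 = %.2f mm\n", widths[0]);
    return 0;
}
```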
`
`
`
`The next step was to select a weld joint and type based on the training
practices of the GDEB Welding School. A T-joint with a one-pass,
12-inch-long horizontal fillet weld (Fig. 2) was selected for the
simulation scenario.
`
Fig. 2 - Weld Joint Selected for Simulation (tee joint with fillet weld)
`
The year-one virtual reality welder training device required population
with data defining how the weld parameters generated by the virtual
user affect the resulting weld. For example, the torch angles and
contact tip-to-work distance affect the weld bead profile. Therefore,
the simulator must know how variations in these parameters affect the
weld bead geometry (i.e., profile). Relationships between welding
parameters (inputs) and weld profile (outputs) were empirically defined by:
• Establishing a range of welding parameters
• Generating weld samples within the parameter ranges
• Generating a predictive methodology
• Providing input to the welding simulator on the effect of welding parameters on weld bead shape
`
`
GDEB provided the following weld parameters as representative of
their welding procedure specifications:
• Filler Wire: 0.045" diameter MIL 70S-3 (AWS ER70S-3)
• Wire Feed Speed Range: 250 to 350 inches per minute
• Voltage Range: 24 to 28 V
• Amperage Range: 200 to 260 A
• Shielding Gas: 95% Argon with 5% Carbon Dioxide, flow rate set at 45 cubic feet per hour
• Work Angle: 35 - 55 degrees
• Travel Angle: 70 - 110 degrees
`
Most of the welding parameters listed above are intuitively obvious;
however, work angle and travel angle are not. The work angle for a
T-joint is the angle of the torch in relation to the perpendicular
faces of the joint, as illustrated in Fig. 3. The travel angle for a
T-joint is the angle of the torch tip in relation to the travel direction,
as shown in Fig. 4.
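
As a rough illustration of how these two angles can be derived from a tracked torch pose, the sketch below computes work and travel angles from a torch axis vector and checks them against the procedure ranges listed above; the coordinate frame, sign conventions, and sample numbers are assumptions for the example, not the project's tracking code.

```cpp
// Rough sketch with a hypothetical frame and sample torch orientation.
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

double dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
double toDegrees(double rad) { return rad * 180.0 / 3.14159265358979323846; }

int main() {
    // Assumed frame: weld axis (travel direction) = +X, base plate in the X-Y
    // plane, vertical plate in the X-Z plane, root of the tee along the X axis.
    const Vec3 travelDir{1.0, 0.0, 0.0};

    // Hypothetical tracked torch axis: unit vector from contact tip toward the joint root.
    const Vec3 torch{-0.17, -0.70, -0.69};

    // Travel angle: angle between the torch axis and the travel direction
    // (90 deg = perpendicular to travel; less = push, more = drag).
    const double travelAngle = toDegrees(std::acos(dot(torch, travelDir)));

    // Work angle: orientation of the torch projection in the cross-sectional
    // (Y-Z) plane, measured from the base plate (45 deg bisects the fillet).
    const double workAngle = toDegrees(std::atan2(std::fabs(torch.z), std::fabs(torch.y)));

    const bool ok = workAngle >= 35.0 && workAngle <= 55.0 &&
                    travelAngle >= 70.0 && travelAngle <= 110.0;
    std::printf("work angle = %.1f deg, travel angle = %.1f deg, in range: %s\n",
                workAngle, travelAngle, ok ? "yes" : "no");
    return 0;
}
```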
`
`
`
`
`
`Fig. 5 – Pixel Based Weld Cross Section Measurements
`
`
`
`
Measured from the Base:
H1 = 0.125" (3.2 mm)
H2 = 0.250" (6.4 mm)
H3 = 0.375" (9.5 mm)
H4 = 0.500" (12.7 mm)
H5 = 0.625" (15.9 mm)
H6 = 0.750" (19.1 mm)
H7 = 0.875" (22.2 mm)

Fig. 6 - Weld Cross Section Profile Measurements (widths W1-W7 taken at heights H1-H7)
`
`
`
`
`
`
`
`
`10
`
`0123456789
`
`Predicted Values
`
`
`
`-1
`
`0
`
`-1
`
`1
`
`2
`
`3
`
`4
`
`5
`
`6
`
`7
`
`8
`
`9
`
`10
`
`Desired Values
`
`W1 W2 W3 W4 W5 W6 W7
`
`
`Fig. 7 - Neural Net Weld Bead Width Predictions for Fig. 6 Height Locations
`
The model is based on sound, comprehensive physics models, including
multiphase flow, turbulence, and a moving mesh. For stress and
distortion analysis, a fluid-structure interaction approach is being
used in ABAQUS.
`
`The weld pool model has three inputs: materials properties; heat from
`arc droplets; and welding parameters. Materials properties of interest
`are thermal conductivity, latent heat, etc. The heat from arc droplets
`will be calculated using empirical equations. Arc energy will be
`modeled as a surface heat flux having a Gaussian distribution (as
`shown in Fig. 8). Droplet energy will be modeled as a cylindrical heat
`source underneath the arc (as shown in Fig. 9).
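
For reference, one common parameterization of such heat sources is sketched below; the paper does not give the exact equations, so the efficiency factors, the Gaussian radius sigma, and the cylinder dimensions should be read as assumed symbols rather than the model's actual values.

```latex
% Assumed illustrative forms: arc energy as a Gaussian surface flux with total
% effective power \eta U I, and droplet energy as a uniform cylindrical source
% of radius r_c and depth h_c beneath the arc (U = arc voltage, I = current).
q_{\mathrm{arc}}(r) = \frac{\eta\, U I}{2\pi\sigma^{2}}
                      \exp\!\left(-\frac{r^{2}}{2\sigma^{2}}\right),
\qquad
q_{\mathrm{drop}} = \frac{\eta_{d}\, U I}{\pi r_{c}^{2}\, h_{c}}
\quad (r \le r_{c},\; 0 \le z \le h_{c})
```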
`
`
Fig. 8 - Arc Energy as Surface Heat Flux
`
`
`
`
`
`The second year of work is aimed at enhancing simulation fidelity and
`developing a curriculum to implement the VR welding trainer at
`GDEB. Toward that end, the current system was assessed with respect
`to GDEB's training needs and the technical advancements that were
`achievable within the year two budget. The weld pool graphics will be
`improved by moving from a neural net trained solely with the
`measurements of physical welds to a neural net trained by a
`combination of numerical modeling and physical measurements. Weld
`pool graphics based on a numerical model will be vastly superior to
`weld pool graphics based on a finite number of welded samples. A
`detailed curriculum will also be developed to exploit the VR welding
`trainer to its maximum potential. This will give GDEB a clear
`implementation plan for the system and will also allow VRSim to move
`to market with a self-contained GMAW training simulator.
`
`The objective of the numerical modeling is to develop a weld pool
`model to predict weld bead geometry. The model will then be
`validated by comparing numerical predictions with experimental welds.
`
The weld pool model under development is based on FLUENT, a
commercial computational fluid dynamics code that features an
unstructured mesh, which is excellent for handling complex joint
geometry with a combination of mesh sizes to minimize computing
time. A fine mesh is used in the weld pool area, which requires a high
level of detail, and a coarse mesh is used in surrounding areas that do not.
`
`
`
`
`
`H
`
`
`
`d
`
`
`Fig. 12 - Longitudinal Cross Section of Modeled Weld
`
`
`
`
`
Using parameters from the original test matrix (Table A1), welds #40
and #41 were modeled and then compared to photos of actual welds
produced with the same parameters.
`
`Fig. 13 is the cross-section photo of weld sample #40 with the model
`predicted cross section indicated by the dotted red line.
`
`
`
`
`Fig. 13 - Weld #40 with Model Predicted Weld Outline
`
`Fig. 14 is the cross-section photo of weld sample #41 with the model
`predicted cross section indicated by the dotted red line.
`
`
`
`
`
`
`
`Fig. 9 - Droplet Energy as Cylindrical Heat Source
`
`Welding parameters from the original test matrix (Table A1) were used
`in the early development stages, as the modeling results could be easily
`compared to existing weld samples.
`
Fig. 10 shows the temperature gradient of a modeled GMAW fillet
weld. Fig. 11 is a close-up view of the same weld; tiny arrows indicate
the direction of fluid flow and the temperature gradient within the
modeled weld pool. Fig. 12 is a cross section along the longitudinal
plane of the modeled weld centerline. Here again, tiny arrows indicate
the direction of fluid flow in the molten weld pool, as well as
temperature gradients in the unaffected base metal, the weld pool, the
recently solidified weld, the heat-affected zone, and the base metal
under the weld.
`
`
Fig. 10 - Temperature Gradient on Modeled Weld (I = 238 A, V = 27 V, TS = 6.4 mm/s, WFS = 0.12)

Fig. 11 - Modeled Weld Pool (I = 238 A, V = 27 V, TS = 6.4 mm/s, WFS = 0.12)
`
`
`
`
`
`
`
`
The final curriculum module, Welding a Real T-Joint, addresses the
ultimate goal of the curriculum: enabling an entry-level welder to
produce a real T-joint after being trained on the simulator. After the
curriculum is developed, the simulator can be sold as a stand-alone
training system for GMAW fillet welding.
`
`SYSTEM DESIGN
`
The system is composed of commercial off-the-shelf (COTS)
hardware and software components. The hardware consists of a real
GMAW welding torch attached to a force feedback (i.e., haptic) device,
a head-mounted display, a six degree of freedom (DOF) tracking system
(for both the torch and the user's head), three computer processors, and
external audio speakers. The software consists of the EndeaVR® software
framework, the NeuralWorks® Professional II/PLUS neural network, and
the HapticMASTER software.
`
`The EVR IO module manages the input and output with the user. Input
`signals come from multiple sources as shown in the system architecture
`of Fig. 15.
`
`
`
`Fig. 15 – System Architecture
`
`Fig. 16 is an illustration of the system hardware schematic.
`
`
`
`
`Fig. 16 – System Hardware Schematic
`
`
`
`
`
`Fig. 14 - Weld #41 with Model Predicted Weld Outline
`
`In both Fig. 13 and Fig. 14, the predicted weld bead shape is nearly
`identical to the actual weld bead. Based on these promising results,
`weld pool modeling development continues and is nearing completion.
`
`When the model is complete, it will be used to train a neural net which
`will then output a predictive algorithm in C code for the improved
`simulation.
`
Three specific characteristics of the simulation that were identified for
further improvement cannot be modeled within the budget constraints
of year two. Therefore, actual physical measurements are necessary to
capture the transition from slow to fast travel speed, changes in contact
tip-to-work distance, and minor torch weaving (both side-to-side and
forward-to-backward movement). Digital video and audio recordings
were required to capture this data in its entirety. Table B1 contains all
the welding parameters that were used to produce the welds for this
task, which was recently completed.
`
A curriculum is needed to fully exploit the benefits of the VR welding
trainer. EWI is currently developing a comprehensive curriculum
composed of the following modules:
• Overview
• Introduction
• Safety
• GMAW Fundamentals
• Equipment and Safety
• Equipment Set-Up
• Process Adjustment
• Simulator System Set-Up and Use
• Level 1 Welding Competency
• Level 2 Welding Competency
• Level 3 Welding Competency
• Welding a Real T-Joint
`
`
`
`
`
`
`
`
`
`Fig. 18 - OXO Welding Torch
`
The welding torch is connected to a three degree of freedom haptic
device, the FCS HapticMASTER (Fig. 19). The capabilities of the
HapticMASTER were extended through the development of a gimbal
(Fig. 20), which attaches the welding torch to the device. This gimbal
allows the torch to rotate through all likely angles of normal use.
Translational forces are applied at the tip of the welding torch. The
HapticMASTER has a work envelope of approximately 1 m³ and a force
resolution of approximately 1 g. As the simulator components came
together, the design was refined to address issues that became apparent
through experimentation.
`
`Fig. 19 - HapticMASTER System
`
`
`
`
`
`
`
As depicted in Fig. 17, the head and welding torch are tracked by an
overhead ultrasonic tracking system (Intersense IS-900 PCTracker), and
torch orientation is redundantly tracked by a haptic device
(HapticMASTER). Visual display is provided through a fully
occluding head-mounted display (Olympus Eye-Trek FMD 250W).
`
`
`
`
`Fig. 17 – Head and Torch Input Sources
`
`The welding torch trigger generates an on/off signal to produce the
`virtual bead. Torch orientation and travel speed are output to a neural
`network, which returns the characteristics of the weld cross section and
`dynamically generates an object representing a weld bead. The shape
`data is fed to the haptic display software, which determines collisions
`(with the T-joint and the surrounding environment) and forces to be
`transmitted to the welding torch. Shape information is also fed to the
`visualization software, which displays the weld bead both in its
`superheated state and as it cools after the welding arc/torch progresses
`along the joint.
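
A simplified per-frame view of this data flow is sketched below; the type names and function bodies are hypothetical stand-ins, not the actual EndeaVR or HapticAPI interfaces.

```cpp
// Simplified per-frame sketch of the data flow described above.
#include <array>

struct TorchState { double workAngle, travelAngle, travelSpeed, ctwd; bool trigger; };
struct BeadCrossSection { std::array<double, 7> width; };  // W1..W7 at the standard heights

// Placeholder for the trained neural network prediction.
BeadCrossSection queryNeuralNet(const TorchState&) { return {}; }
// Placeholder for the haptic (force) simulation update: collisions and reaction forces.
void updateHapticModel(const BeadCrossSection&) {}
// Placeholder for the visual simulation update: superheated bead that cools behind the arc.
void updateVisualization(const BeadCrossSection&, bool /*superheated*/) {}

void simulationFrame(const TorchState& torch) {
    if (!torch.trigger) return;  // trigger released: no arc, no bead deposited this frame

    // Torch orientation and travel speed drive the bead cross-section prediction.
    const BeadCrossSection bead = queryNeuralNet(torch);

    // The same shape data feeds both the force (haptic) and the visual models,
    // which run in parallel with the welding simulation on networked computers.
    updateHapticModel(bead);
    updateVisualization(bead, /*superheated=*/true);
}

int main() {
    simulationFrame(TorchState{45.0, 90.0, 6.4, 16.0, true});  // illustrative values
    return 0;
}
```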
`
To accurately depict the GMAW process, three simultaneous
simulations are necessary: force, welding, and visualization. These
models run simultaneously on three networked computers using a
neural network. In other words, the "welding" simulation runs in real
time in parallel with the force (i.e., haptic) and visualization simulation
models.
`
`HARDWARE
`
`The system features a real GMAW welding torch and attached cable.
`The 400 Amp OXO MIG gun (part number APX4015AC1EM 3545)
`was selected, as this is the welding torch used at the GDEB Welding
`School (see Fig. 18).
`
`
`
`
`
Anti-backlash lead screw spindles are mounted with a flexible coupling
to the axis of a direct-current brushed motor. This backlash-free
solution introduces some friction in the mechanism; however, the
friction is completely eliminated by the control loop, up to the accuracy
of the force sensor. The result in force-control mode is backlash-free,
stick-slip-free, smooth motion at the end effector. A sensitive strain
gauge force sensor is located directly behind the end effector. The
interaction force is measured as close to the human hand as possible to
avoid distortion of the force signal and to optimize system
performance.
`
Exchangeable end effectors can be mounted to the force sensor to
match the end application. For instance, with an end effector with three
measured rotations in which tools can be clamped (e.g., medical
instruments or screwdrivers), the tool contact forces can be simulated in
virtual settings. Another example is an end effector in which a human
arm can be placed. The gimbal end effector has one active and two
passive degrees of freedom. With this set-up, robot therapy in a virtual
setting is given to patients who have suffered strokes (Amirabdollahian
2002). The advantage of robot therapy is that the level of assistance
can be accurately adapted to the patient, and any improvements can be
measured.
`
`Visual display is provided through a fully occluding head-mounted
`display (Olympus Eye-Trek FMD 250W wide screen glasses) with
`resolution of 800x600 for each eye. The display signal is converted
`from a digital raster image to National Television System Committee
`(NTSC) video with a scan converter and is presented as a single
`monoscopic image.
`
`The motion of the welding torch and the welder’s head (i.e., view) is
`tracked using a commercial ultrasonic six degree of freedom tracking
`system produced by Intersense Corporation (IS-900 PCTracker). The
`angular orientation of the welding torch is also tracked using a three-
`axis measurement gimbal at the attachment of the torch to the haptic
`device. Audio display of welding sounds is provided through desktop
`speakers.
`
`The year two system will include the Silicon Graphics Prism
`visualization system operating on a 64-bit open system Linux®
`environment with two ATI® graphics pipes and will feature four Intel®
`Itanium® 2 processors.
`
`SOFTWARE
`
`The system uses VRSim's EndeaVR® software framework to control
`the simulations. EndeaVR® is a suite of products that allows the user
`to interact with CAD files in real-time simulations. Each model is
`represented with a high level of visual and interactive fidelity.
EndeaVR® allows for the testing of digital mock-ups to determine
maintenance accessibility, as well as positioning with other
components. EndeaVR® also allows for collaborative interactions
between multiple users at multiple sites. The EndeaVR® software
framework is built on a series of components, bringing rapid
development time to the user.
`
Components of the EndeaVR® software framework:
• Scene Builder: component behavioral control to automatically assign characteristics to imported parts.
• Easy Objects: preconstructed tools and components for plug-and-play use within the simulation.
`
`
`
`
`Fig. 20 – Torch Gimbal
`
The HapticMASTER was selected primarily due to its range of motion
and its capability to hold the weight of the welding torch. The
HapticMASTER hardware comprises two main functional components:
the robot arm and the control unit. The robot serves as the actual force
display, whereas the control unit houses the electronics, such as the
amplifiers, the safety relay, and the haptic server.
`
The mechanism of the robot arm is built for zero backlash and minimal
weight. Zero backlash is a requirement because the human sense of
touch is extremely sensitive to vibration effects due to play; the human
tactile senses have a spatial resolution of up to 10-100 microns for
vibration (Burdea 1996). Minimal weight is a requirement for safety,
since both the speed and the mass of the robot arm determine its energy
in a collision with the T-joint and other components in the simulated
welding booth. The speed is set to the value of a normal human arm
motion (2.2 m/s), and a lightweight aluminum tubing construction
minimizes the mass of the robot arm. The kinematic chain from the
bottom up provides base rotation, arm up/down, and arm in/out, which
gives three degrees of freedom at the end effector. A volumetric
workspace is created (Fig. 21) that is large enough to enclose most
human single-handed or double-handed tasks.
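
Interpreting that chain as cylindrical coordinates, a minimal forward-kinematics sketch is shown below; the joint values and the mapping are illustrative assumptions, not FCS specifications.

```cpp
// Sketch of the end-effector position implied by the kinematic chain described
// above (base rotation, arm up/down, arm in/out), treated here as cylindrical
// coordinates. Joint values are illustrative placeholders.
#include <cmath>
#include <cstdio>

struct Joints { double baseRotRad; double armHeightM; double armReachM; };

void forwardKinematics(const Joints& q, double& x, double& y, double& z) {
    x = q.armReachM * std::cos(q.baseRotRad);   // in/out along the rotated arm
    y = q.armReachM * std::sin(q.baseRotRad);   // base rotation sweeps the arm
    z = q.armHeightM;                           // up/down translation
}

int main() {
    const Joints q{0.5 /*rad*/, 0.25 /*m*/, 0.40 /*m*/};
    double x, y, z;
    forwardKinematics(q, x, y, z);
    std::printf("end effector at (%.3f, %.3f, %.3f) m\n", x, y, z);
    return 0;
}
```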
`
`
`
`Fig. 21 - HapticMASTER Workspace
`
`
`
`
`
`
• VRmath: provides fast, high-level calculations to enhance the dynamics of the real-time simulation.

EndeaVR® delivers the simulation through traditional interfaces, as
well as with full-body tracking and head-mounted displays. Multiple
users can also interface with the system with both hands for a natural
assessment of tasks incorporating digital parts.
`
The neural net was trained by EWI using NeuralWorks® Professional
II/PLUS, a state-of-the-art software tool for rapidly creating and
deploying prediction and classification applications with a proven
record of use in various welding applications.
`
NeuralWorks® Professional II/PLUS was selected due to its
performance in the following areas:
• The graphical interface is user friendly.
• Major network types are supported (necessary to choose the appropriate neural network topology for adequate and accurate modeling of welding-specific applications).
• Modification and adjustment of the network architecture is very flexible (this functionality is absolutely necessary since extension of the network architecture will be needed to include additional welding process variables in year two of this project).
• The trained network has excellent deployment capability (required to generate standard C/Fortran code to deploy and integrate the trained network into the simulator hardware).
`
`
`The HapticMASTER is controlled by a software module provided by
`FCS, which integrates with the I/O module (EVR IO in Fig. 15). The
`HapticMASTER’s virtual model is rendered by a dedicated personal
`computer with a VxWorks real-time operating system (called the haptic
`server), which runs at a fixed 2,500 Hz refresh rate. This frequency is
`assumed to be high enough to guarantee haptic quality for a smooth and
`realistic experience, since it is approximately ten times higher than the
maximal human discrepancy value (Amirabdollahian 2002). Finally,
the proportional-integral-derivative (PID) motor control loop runs on
the amplifiers (located in the control box) at a 20 kHz pulse-width-modulation
frequency.
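
The sketch below illustrates a fixed-rate PID servo of the general kind described here, driving a toy one-degree-of-freedom plant; the gains, the apparent mass, and the reference signal are placeholders rather than values from the FCS implementation.

```cpp
// Minimal fixed-rate PID servo sketch with placeholder gains and a toy plant.
#include <cstdio>

struct Pid {
    double kp, ki, kd;
    double integral = 0.0, prevError = 0.0;

    double update(double setpoint, double measured, double dt) {
        const double error = setpoint - measured;
        integral += error * dt;
        const double derivative = (error - prevError) / dt;
        prevError = error;
        return kp * error + ki * integral + kd * derivative;  // motor force command
    }
};

int main() {
    Pid servo{120.0, 15.0, 2.0};          // illustrative gains
    const double dt = 1.0 / 20000.0;      // 20 kHz control-loop rate
    const double mass = 2.5;              // apparent mass at the end effector (kg)
    const double reference = 0.01;        // reference position from the haptic server (m)

    double position = 0.0, velocity = 0.0;
    for (int i = 0; i < 20000; ++i) {     // simulate one second
        const double force = servo.update(reference, position, dt);
        velocity += (force / mass) * dt;  // toy plant: F = m a
        position += velocity * dt;
    }
    std::printf("position after 1 s: %.4f m (reference %.4f m)\n", position, reference);
    return 0;
}
```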
`
`With an application programming interface (API), the user creates the
`virtual model on the haptic server. The real-time operating system on
`the haptic server interprets the virtual model and generates the
`trajectories for the robot, based on the force sensor input. The haptic
server also handles safety guards, communication protocols, and
collision detection with virtual objects. The HapticAPI,
`which is a C++ programming interface, is used to make an Ethernet
`connection to the HapticMASTER to control the internal state machine
`and to define or modify the virtual haptic world. Haptic effects can be
`created (like dampers and springs), and spatial geometric primitives can
`be defined (like spheres, cones and cubes). Simple virtual worlds can
`be created using these effects and primitives. When more complex
`virtual worlds are required, e.g. with meshed surfaces or deformation,
`another rendering method needs to be applied. A local mass model will
`be rendered on the haptic server, and the forces acting on this mass due
`to interaction with the virtual world are rendered from a host PC.
`When the end effector collides with a virtual object, an appropriate
`force and displacement are presented to the user. The relationship
`between force and displacement is given by the object properties of the
`virtual model (e.g. stiffness, damping, friction, etc.). Using a penalty-
`based method, the appropriate relation between force and displacement
`is calculated by the real-time operating system and incorporated in the
`position, velocity, and acceleration (PVA) signal.
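
A minimal sketch of a penalty-based contact force follows; the spring-damper form is a common choice, and the stiffness and damping values are placeholders rather than HapticMASTER settings.

```cpp
// Minimal sketch of a penalty-based contact force: when the torch tip penetrates
// a virtual surface, a spring-damper force pushes it back out. Values are placeholders.
#include <algorithm>
#include <cstdio>

struct Contact {
    double stiffness;  // N/m
    double damping;    // N*s/m
};

// 'penetration' is how far the tip is inside the object along the surface normal
// (>= 0); 'normalVelocity' is the tip velocity along that normal, positive when
// moving further in. Returns the force magnitude along the outward normal.
double penaltyForce(const Contact& c, double penetration, double normalVelocity) {
    if (penetration <= 0.0) return 0.0;                       // no contact, no force
    const double f = c.stiffness * penetration + c.damping * normalVelocity;
    return std::max(0.0, f);                                  // never pull the tip inward
}

int main() {
    const Contact plate{20000.0, 50.0};                       // illustrative values
    std::printf("force at 1 mm penetration: %.1f N\n", penaltyForce(plate, 0.001, 0.0));
    return 0;
}
```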
`
`
`The HapticMASTER uses an admittance control algorithm. A force
`sensor measures the interaction force between the user and the system.
`From these forces a virtual mass model calculates the PVA, which an
`object touched in the virtual world would obtain as a result of this
`force.
`
The virtual world defines the space in which the object lives (e.g.,
gravity, environmental friction, position of the object, etc.) and the
object properties (e.g., mass, stiffness, damping, friction, etc.). The
virtual mass model will typically contain a mass larger than zero to
avoid commanding infinite accelerations and causing system instabilities.
The PVA vector serves as a reference signal for the robot, realized by a
PID servo control loop.
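
The sketch below illustrates the admittance idea in code: a measured hand force drives a virtual mass-damper model whose integrated position, velocity, and acceleration become the PVA reference for the servo loop; the mass and damping values are placeholders.

```cpp
// Minimal admittance-control sketch: measured force -> virtual mass model -> PVA.
#include <cstdio>

struct VirtualMass {
    double mass    = 2.5;   // kg; must be > 0 to avoid commanding infinite accelerations
    double damping = 5.0;   // N*s/m; environmental friction in the virtual world
    double pos = 0.0, vel = 0.0, acc = 0.0;

    // Advance the model by one haptic-server tick under the measured interaction
    // force plus any force contributed by the virtual environment.
    void step(double measuredForce, double environmentForce, double dt) {
        acc = (measuredForce + environmentForce - damping * vel) / mass;
        vel += acc * dt;
        pos += vel * dt;
        // (pos, vel, acc) is the PVA reference handed to the PID servo loop.
    }
};

int main() {
    VirtualMass model;
    const double dt = 1.0 / 2500.0;            // haptic server refresh rate
    for (int i = 0; i < 2500; ++i) {           // one second of a steady 1 N push
        model.step(/*measuredForce=*/1.0, /*environmentForce=*/0.0, dt);
    }
    std::printf("after 1 s: pos = %.3f m, vel = %.3f m/s\n", model.pos, model.vel);
    return 0;
}
```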
`
With proper feedback gain settings, this control loop will compensate
for the real mass of the manipulator up to a factor of six and eliminates
its internal friction up to the accuracy of the force sensor. Therefore, if
the mass of the manipulator behind the actuator is 15 kg, the operator
feels only 2.5 kg at the end effector. Since gravity can also be
compensated (if desired), the operator does not even feel as though 2.5 kg
is being moved.
`
`SIMULATOR OPERATION
`
`The system is controlled by VRSim's EndeaVR® virtual reality
`operating system that manages all aspects of the simulation. The
`simulation software ties into the underlying operating system. The
`EndeaVR® simulation module (EVR SIM in Fig. 15) handles all
`aspects of software behavior control of virtual objects. The EndeaVR®
`I/O module (EVR IO in Fig. 15) manages the input and output with the
user. This includes input from the tracking system, which gives the
system the position and orientation of the user's head, as well as
maintaining the calibration of the relative positions of the head-mounted
display (HMD) and the HapticMASTER. The EVR IO also
`creates the output to the HMD.
`
`The torch trigger feeds an on/off signal that generates the virtual bead.
`The torch orientation and travel speed are output to the neural net,
`which returns the geometric characteristics that dynamically generate
`weld bead shape based on cross section predictions. The shape data is
`also fed to the haptic display software, which determines collisions
`with the weld test piece (and the resultant forces) to be transmitted to
`the torch. Shape information is also fed to the visualization software,
`which displays the bead both in its superheated state and as it cools
`behind the travel path of the welding arc.
`
`The welding simulation is currently based on empirical results from the
`detailed analysis of the forty-three test welds and is generated by three
`computers running simultaneous virtual models of force, simulation
`and visualization.
`
`In real-time, the neural net software communicates directly with the
`simulation module (EVR SIM). The EVR SIM module calculates the
`position, orientation and speed of the welding torch based on
`information from the tracking devices. These parameters are fed to the
`neural net software, which determines the cross sectional shape of the
`weld bead.
`
`Visualizations of the weld bead and the fluid dynamic properties were
`created in the simulation module (EVR SIM). The HapticMASTER is
`controlled by a software module provided by FCS, which integrates
with the I/O module (EVR IO). It also interacts directly with the
simulation module (EVR SIM) based on the forces needed in
the simulation.
`
`
`
`
Thirty-eight percent of simulator evaluators identified themselves as
beginners (40 persons); 17% (18 persons) reported an intermediate
level of experience, 25% (26 persons) were advanced, and 18% (19
persons) considered themselves masters of the GMAW process.
`
When asked to rate the simulator, the majority of users rated the
welding simulator as "good," "great," or "excellent" (Fig. 26).
`
`When asked to rate the simulator as a tool for training new welders, the
`majority of survey respondents (63%) indicated that the simulator
`would be a useful tool in training welders (Fig. 27).
`
`CONCLUSIONS
`
A new welder training approach was developed that leverages the
`current state-of-the-art virtual reality technology for integration into the
`product simulation and product-centered manufacturing approaches
`currently being developed and applied by GDEB (Porter July 2004).
`This project developed and demonstrated a mixed reality system that
`simulates the GMAW welding process. The system provides a
`reasonably realistic experience of the actual welding process wherein a
`user holds a real welding torch while seeing and hearing a virtual weld
`bead created with satisfactorily predicted quality. The system is
`currently under development to further refine the visual, audio, and
`haptic fidelity (Porter May 2004).
`
`COMMERCIALIZATION ACTIVITIES
`
`On April 27, 2005, VRSim and Silicon Graphics (SGI) announced the
`commercial availability of the Virtual Reality Welding Trainer (Loskot
`2005). The system includes the Silicon Graphics Prism visualization
`system operating on the 64-bit open system Linux® environment with
`four Intel® Itanium® 2 processors, two ATI® graphics pipes, VRSim
`Welding Software suite, FCS Robotics HapticMASTER System,
`motion-tracking head-mount display, and Instruction Console. The
`system is scalable to multiple training booths per Instructor Station.
`
`ACKNOWLEDGEMENTS
`
`The project team acknowledges the significant contributions of the
`following individuals: Scott Lovell and Matt Bennett of VRSim; Mike
`Gendron and Gordon Gendron of the GDEB Quonset Point Welding
`School; Ken Peters of GDEB IPDE; Jim Reynolds, Chris Conrardy, and
`Randy Dull of EWI; and Luke Bailey of the Hobart Institute of
`Welding Technology. Special thanks go to Sal Lamesa and Ken Fast of
`GDEB who diligently ran 100+ users through the simulator evaluation
`at GDEB Quonset Point and collected survey responses.
`
`
`
`All of the system components work together to give the user the
`sensation of real GMA welding (with realistic weight, feel, audio and
`visuals).
`
`The simulated visual environment features the sparse industrial setting
`of a welding booth (Fig. 22) with a work table and welding test piece
`(i.e., tack welded T-joint ready for welding).
`
`
`
`
`
`Fig. 22 – Simulated Welding Booth
`
`When the user strikes an arc (i.e., when the torch trigger is depressed),
`the view goes dark, illuminated only by the arc, simulating the view
`through a real welding