Are You Ready to Automate? (April 23, 2020)

Blog 2 - 2020
Automation considerations

If you have spent any time in a major automotive plant, you understand that automation is a critical part of the infrastructure for building vehicles. Robots, conveyors, and other automation equipment are accepted at a level unseen in many industries. This automation helps auto manufacturers be more efficient, building high-quality parts at rates that can boggle the mind. Today, in some areas of the plant, there are hardly any people as robots and automated guided vehicles (AGVs) transfer parts from station to station.

Yet while automation is accepted in most operations, it is not the first option for many manufacturers when it comes to metrology and process control. This reluctance to automate has left many quality control strategies in the proverbial dark ages. Applying automation to your metrology processes can increase measurement throughput, reduce the direct labor involved in measurement, and improve your overall quality. When deciding to automate your metrology process, it is helpful to consider a few key questions before making the investment:

  1. Can I achieve the accuracy and repeatability necessary with an automated solution?
  2. How fast does my measurement system need to be?
  3. What uptime do I need?
  4. How flexible does my system need to be?

Can I Achieve the Accuracy and Repeatability Necessary?

Before we go into how accuracy and repeatability relate to metrology performance, it is important to differentiate between the two concepts, as they are often misunderstood and misapplied. Simply put, a measurement device is accurate if it provides a reading that can be traced back to an accuracy standard, typically by measuring a target such as a sphere or a more accurate comparison device. In metrology, accuracy is typically expressed in microns (millionths of a meter). One example of accuracy to a standard is the testing Perceptron uses for its AccuSite system. AccuSite is tested to the ISO 10360-8 standard, verifying that the measurement device is accurate to 150 microns through the entire measurement volume. Accuracy tests like these are typically performed on the plant floor as part of a system buyoff, with the results compared against an acceptable tolerance. When you read an accuracy specification, make sure it applies to the entire measurement volume, not just the system’s sweet spot.

Repeatability is the ability of a measurement device to provide the same result, within a certain error, over repeated measurements. It is an equally important performance criterion and is subject to the same considerations as accuracy. Automated metrology systems typically have superior repeatability relative to manually operated systems because there is no operator influence on the measurement results. It is important that you purchase a device that will be repeatable enough to handle the bulk of your metrology tasks.
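To make the distinction concrete, here is a minimal sketch (the readings and reference value are hypothetical, not Perceptron data): accuracy shows up as the bias of the readings against a traceable reference value, while repeatability shows up as the spread of repeated readings of the same feature, commonly reported as 3-sigma.

```python
# Minimal sketch: accuracy (bias vs. a traceable reference) vs. repeatability
# (spread of repeated readings). All values are hypothetical.
from statistics import mean, stdev

# Repeated readings of the same feature, in millimeters.
readings_mm = [10.012, 10.015, 10.011, 10.014, 10.013, 10.016, 10.012]

# Reference value for the same feature from a traceable standard (e.g. a CMM).
reference_mm = 10.000

bias_mm = mean(readings_mm) - reference_mm      # relates to accuracy
repeatability_mm = 3 * stdev(readings_mm)       # spread, reported as 3-sigma

print(f"Bias vs. reference:      {bias_mm * 1000:.1f} microns")
print(f"Repeatability (3-sigma): {repeatability_mm * 1000:.1f} microns")
```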

When trying to determine how accurate your device needs to be, work back from the measurements you are trying to make and look at the tolerances for those measurements. Remember that higher-accuracy devices typically carry a higher price tag and may require more maintenance to stay as accurate as they were when first installed; it may not pay to buy the most accurate device available.
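One common way to put a number on this, offered here only as a general rule of thumb rather than anything prescribed in this post, is to require that the measurement device’s error consume no more than about one tenth of the feature’s tolerance band. A quick sketch:

```python
# Rule-of-thumb sketch (an assumption, not a requirement from this post):
# the gauge error should consume no more than ~1/10 of the tolerance band.
def max_gauge_error_mm(tolerance_band_mm: float, ratio: float = 10.0) -> float:
    """Maximum acceptable gauge error for a given tolerance band."""
    return tolerance_band_mm / ratio

# Example: a feature toleranced at +/-0.5 mm has a 1.0 mm tolerance band.
band_mm = 1.0
limit_mm = max_gauge_error_mm(band_mm)
print(f"Target gauge error: <= {limit_mm:.2f} mm ({limit_mm * 1000:.0f} microns)")
```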

Systems like AccuSite have been designed to provide both high repeatability and micron-level accuracy, so manufacturers do not have to compromise on either in their automated system.

Speed

Speed is essential in automated metrology because it is directly related to measurement throughput, but it affects other factors as well. In general, speed can impact the accuracy and repeatability of a measurement system. Consider, for example, a robotic measurement system in which a non-contact sensor is mounted to an industrial robot. At slow speeds, the robot will not experience as much backlash and will produce a better measurement result; speed the robot up and the measurement quality may degrade significantly. When specifying your system, make sure you include a requirement for accuracy and repeatability at production speeds.

Another key question about speed is “how fast do you need to go?” Your in-line automated measurement should run at least as fast as your fastest production cycle while still allowing you to measure all of your KPIs in real time. When deciding on a robotic measurement solution, consider all of the components that influence your productivity: cycle time, robot speed, and resource time and availability.
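As a quick sanity check before committing to a solution, it can help to rough out whether the measurement routine actually fits inside your fastest production cycle. The sketch below uses entirely hypothetical numbers for checkpoint count, per-point times, and cycle time:

```python
# Minimal sketch with hypothetical numbers: does a robotic measurement
# routine fit within the fastest production cycle time?
checkpoints = 24                 # features measured each cycle
measure_s_per_point = 1.2        # sensor acquisition time per feature
robot_move_s_per_point = 0.8     # average robot repositioning time per feature
overhead_s = 4.0                 # part settle, clamping, data transfer, etc.

measurement_cycle_s = checkpoints * (measure_s_per_point + robot_move_s_per_point) + overhead_s
production_cycle_s = 60.0        # fastest line cycle time

if measurement_cycle_s <= production_cycle_s:
    print(f"Fits: {measurement_cycle_s:.1f}s of measurement in a {production_cycle_s:.1f}s cycle")
else:
    print(f"Too slow: {measurement_cycle_s:.1f}s exceeds the {production_cycle_s:.1f}s cycle; "
          "reduce checkpoints, add sensors, or split checkpoints across cycles")
```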

Uptime

Uptime is related to throughput but is a standalone metric more closely tied to performance. Uptime, simply put, is the amount of time the device is running and available to make measurements. It can be impacted by more than just the overall reliability of the system: find out what the maintenance schedule is for the equipment, because you will most likely not be able to measure while preventive maintenance is being performed. For hardware and data confidence, it is critical to ask your vendor what the maintenance requirements are. Uptime in an automated in-line system is even more critical because the line often cannot run unless the measurement system is functioning; measurement system downtime can equal production downtime.
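Because planned maintenance eats into measurement time just as unplanned faults do, it is worth estimating uptime with both included. A minimal sketch, again with hypothetical numbers:

```python
# Minimal sketch with hypothetical numbers: uptime as the share of scheduled
# production time the system is actually available to measure, counting both
# unplanned downtime and planned preventive maintenance.
scheduled_hours = 120.0          # e.g. three-shift operation per week
unplanned_downtime_hours = 2.0   # faults, crashes, sensor errors
planned_maintenance_hours = 3.0  # preventive maintenance windows

available_hours = scheduled_hours - unplanned_downtime_hours - planned_maintenance_hours
uptime_pct = 100.0 * available_hours / scheduled_hours

print(f"Uptime: {uptime_pct:.1f}% of scheduled production time")
```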

Flexibility

Flexibility is a key factor in your automation strategy and is often a main reason for automating. Today’s manufacturers are using concepts like palletized builds that allow them to run multiple part types down the same production line. In industrial automation, the word flexibility is often overused. Everyone markets their solution as “flexible,” but what does flexibility truly mean? The honest answer is “it depends”; one could say that “flexibility is in the eye of the beholder.” Because flexibility is relative to the task and the facility where the system will be located, do not take vendor claims at face value when they position their solution as flexible, especially when they are referring to metrology solutions.

For a metrology device to be flexible it should, at a minimum, include the following attributes:

  1. Ease of programmability
  2. Ability to measure multiple parts and feature types with no targets or sprays required

Ease of programmability is key to flexible metrology, but it is often misrepresented. Watching a well-trained, skilled technician program a part on the fly during a demo is not a real representation of programmability; neither are the initial programs completed by the vendor during installation. A metrology device should only be considered easy to program if the programming can be done by trained plant personnel without support from the metrology supplier. If you must call your vendor and cut a purchase order every time you need to make a change, the system is not easy to program and should not be considered flexible. The vendor’s on-site technician makes things look easy because it is their job and they have a lot of time on task. Think instead about your employee who takes training at the original install but then does not use the skill for six months. Ease of programmability ensures your employees can maximize the flexibility of the measurement system.

The ability to measure multiple parts and feature types is another key piece of the flexibility puzzle. It is imperative that you select a metrology device that will cover all your critical parts that require inspection. Of course, no system can measure everything, so you may at times need to apply the 80/20 rule to this flexibility requirement. The best way to know for sure if a system will measure what you need is to have the potential vendors measure some of your actual parts during the selection process. Allow the vendor to put together a nice presentation of the results, but also ask for the raw data. If the vendor is not eager to share the raw data with you, it should be a red flag about their solution.

Automation is a very valuable tool for modern manufacturers and every employee should look for areas to automate. Metrology is the perfect area for manufacturers to benefit from automation, but due diligence is required to make sure automation improves your overall productivity.

History of Automated In-Line Accurate Measurement (March 1, 2020)

Blog 1 - 2020
Looking Back ...

Prior to the advent of automated in-line measurement, the traditional dimensional quality control strategy in the automotive body shop relied on sampling production with off-line CMMs in temperature-controlled measurement rooms. The metrology science and techniques for touch-probe contact measurement were developed in the 1970s by metrology engineers in collaboration with the CMM companies, and the quality engineers operating the CMMs were highly trained metrology specialists. The absolute accuracy of a typical body shop CMM could reach 0.010mm in a local area, but when assessed throughout the machine volume, particularly with dual-arm configurations, a maximum error closer to 0.100mm was the reality.

When Perceptron introduced plant-floor-hardened automated in-line measurement in the mid-1980s, the focus was on 100% measurement data and statistical process techniques for process variation reduction. The repeatability of the Perceptron technique was typically less than 0.100mm 3-Sigma. The systems were well suited to relative measurement, typically achieving a relative accuracy error on the order of 10% because the relation from sensor coordinates to part coordinates was only crudely measured.

And so the debate over 100% measurement versus sampling began. One big question was what to do with the overload of data. Another was how much accuracy is enough. Data confidence also became a challenge, as the laser optical techniques, which rely on image processing, were subject to influences that affected the results differently than the CMM touch probes. The desire for traceability of the in-line measurements drove a process of correlating and offsetting the in-line measurements relative to the CMM, and this became a major effort for the quality engineers in the measurement rooms.

In the late 1980s, Perceptron invented and patented a technique for calibrating the in-line measurement stations directly into absolute coordinates. The technique made use of theodolites referenced to the part coordinate origin and a calibration target measurable by both the theodolites and the measurement sensor’s laser. The relation from sensor coordinates to absolute part coordinates was generated for each sensor, then stored and applied to the measurements. This technique typically achieved absolute accuracy within 0.250mm when applied to fixed-mounted sensors. It reduced the CMM correlation and offset effort, but the differences between the optical and touch-probe techniques remained.

In the early 1990s, interest in flexible automation and measuring with robots positioning sensors, rather than fixed mounted sensors, for each checkpoint was growing—particularly in Japan and Korea. This was driven partly by the desire to run multiple models on a single line rather than single model dedicated tooling.

Error from robot repeatability and thermal drift had to be overcome, and Perceptron and Nissan developed high-accuracy measurement robots with rectilinear axes to allow straightforward linear correction of thermal drift error. The measurement data was processed to optimize the numerically controlled tooling—an early instance of Industry 4.0-level automation and information exchange. This was followed by techniques for applying kinematic model-based thermal compensation to standard industrial robots to reduce measurement error caused by robot thermal drift. Absolute accuracy was initially still achieved by reference measurement techniques at each checkpoint, such as the theodolite or eventually the laser tracker, but results were never as accurate as with fixed-mounted sensors.

During the early 2000s, techniques to calibrate robots into absolute coordinates and sustain that calibration were developed and refined, with the goal of simplifying the use of measurement robots and increasing the flexibility of the in-line measurement stations. The robot kinematic models and compensation techniques became more sophisticated and accurate. The industry-leading techniques developed by Perceptron to compensate for the absolute error of the robot TCP position, and for the relation from sensor coordinates to TCP coordinates to part coordinates, could be relied on to achieve an absolute volumetric accuracy approaching 0.250mm. Standards such as ISO 10360-8 were also developed and adopted for validating and comparing the volumetric accuracy of automated systems.

More recently, Perceptron has pioneered major advances in optical measurement techniques and 3D point cloud laser sensors, such as the Helix sensor family. Helix was developed to produce measurements that exactly match the CMM touch-probe techniques, virtually eliminating this long-standing correlation error factor. Perceptron also developed self-learning software that compensates measurements so that plant-floor temperature-induced dimensional changes of the measured part do not influence the measurement results. Software for split-cycle configurations, in which different checkpoints are measured on different cycles, has been introduced to maximize in-line checkpoint coverage. And off-line programming techniques, including the use of Digital Twins to fully simulate automated systems, have simplified the programming and maintenance of the automated systems.