Augmented Reality: How is it created?

Augmented reality development: what is it?
Augmented reality development, done well, is extremely useful in automated factories building high-complexity, high-value products such as cars and aircraft, where engineers need to install parts or perform specific operations at each stage of production. The process usually begins with large-scale mapping of the manufacturing site using structured light 3D scanning technology. This technology suits augmented reality development because it does not rely on time-of-flight calculations; instead it measures how known projected patterns deform across the scanned surfaces. Its main drawback is that it works best in the dark, but it is relatively easy to set up, requiring only a projector, software and at least one camera to capture the results. Crucially, it avoids the backscatter of 'anomalous data' from reflective surfaces that affects laser-based systems. Structured light 3D scanning was in fact first used at large scale to scan military assets, assessing manufacturing tolerances on complex, compound surfaces during prototyping of new platforms. The same capability allowed existing products to be quickly adapted and advanced during upgrade projects.
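To make the principle concrete, here is a minimal numpy-only sketch of the pattern side of such a pipeline, assuming a binary Gray-code stripe scheme; the function names are illustrative, and a real scanner would pair this decoding with calibrated projector-camera geometry to triangulate depth.

```python
import numpy as np

def gray_code_patterns(width: int, n_bits: int) -> np.ndarray:
    """Generate vertical Gray-code stripe patterns for a projector of given width.

    Returns an array of shape (n_bits, width) with values 0/1; each row is one
    pattern to project. The stack of captured images then encodes, per camera
    pixel, which projector column illuminated it.
    """
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)  # binary-reflected Gray code for each column
    bits = (gray[None, :] >> np.arange(n_bits - 1, -1, -1)[:, None]) & 1
    return bits.astype(np.uint8)

def decode_columns(captured_bits: np.ndarray) -> np.ndarray:
    """Recover projector column indices from thresholded captures.

    captured_bits: shape (n_bits, H, W), values 0/1 (bright/dark per pattern).
    Returns an (H, W) map of projector columns; combined with the known
    projector-camera geometry this yields depth by triangulation.
    """
    n_bits = captured_bits.shape[0]
    weights = 1 << np.arange(n_bits - 1, -1, -1)
    gray = np.tensordot(weights, captured_bits, axes=1)
    # Convert Gray code back to plain binary column indices.
    binary = gray.copy()
    shift = gray >> 1
    while shift.any():
        binary ^= shift
        shift >>= 1
    return binary
```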

The workflow of augmented reality development
Once you have the scan data, developing an AR solution for the factory becomes straightforward. During this stage you can also scan in any new equipment that will be used, or take existing digital assets from suppliers' websites and import them into your computer aided design (CAD) software of choice. Some suppliers provide models with movable parts, which let you animate the production process and check for collision issues. Finally, any custom items will need to be modelled by a CAD engineer and added to the environment.
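As a rough illustration of that assembly step, the snippet below uses the open-source trimesh library to combine a scanned environment with supplier and custom CAD meshes in one scene; the file names and placement transforms are placeholders rather than a particular vendor's workflow.

```python
import trimesh

# Illustrative file names: in practice these come from the structured light
# scan export, a supplier download, and an in-house CAD model respectively.
factory_scan = trimesh.load("factory_floor_scan.obj")
robot_arm = trimesh.load("supplier_robot_arm.stl")
fixture = trimesh.load("custom_fixture.stl")

scene = trimesh.Scene()
scene.add_geometry(factory_scan, node_name="floor")
# Place purchased and custom equipment into the scanned coordinate frame.
scene.add_geometry(robot_arm, node_name="robot_arm",
                   transform=trimesh.transformations.translation_matrix([2.0, 1.5, 0.0]))
scene.add_geometry(fixture, node_name="fixture",
                   transform=trimesh.transformations.translation_matrix([2.4, 1.5, 0.0]))

# Hand the combined layout off to the CAD/AR tooling of choice.
scene.export("line_layout.glb")
```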

Once these items are in place, you can animate them using CAD software designed for layout planning and for simulating production at each work cell. You can then calculate the cost per unit to assemble and check that manufacturing rates are balanced, so that no single location becomes a bottleneck or accumulates excess inventory.
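A simplified sketch of that calculation, with entirely made-up work cells, cycle times and rates, might look like this: the slowest cell sets the line rate, and the per-unit cost is the sum of each cell's cycle time multiplied by its loaded rate.

```python
# Hypothetical work-cell data: cycle time in minutes and loaded labour/machine
# rate per hour. The numbers are placeholders, not real figures.
work_cells = {
    "body_fitment": {"cycle_min": 6.0, "rate_per_hr": 90.0},
    "wiring_loom":  {"cycle_min": 8.5, "rate_per_hr": 75.0},
    "quality_gate": {"cycle_min": 4.0, "rate_per_hr": 60.0},
}

# The slowest cell sets the line rate; faster cells accumulate idle time or
# inventory unless the line is rebalanced.
bottleneck, slowest = max(work_cells.items(), key=lambda kv: kv[1]["cycle_min"])
units_per_hour = 60.0 / slowest["cycle_min"]

cost_per_unit = sum(c["cycle_min"] / 60.0 * c["rate_per_hr"]
                    for c in work_cells.values())

print(f"Bottleneck: {bottleneck} ({slowest['cycle_min']} min/unit)")
print(f"Line rate:  {units_per_hour:.1f} units/hour")
print(f"Assembly cost per unit: {cost_per_unit:.2f}")
for name, cell in work_cells.items():
    utilisation = cell["cycle_min"] / slowest["cycle_min"]
    print(f"  {name}: {utilisation:.0%} utilised against the bottleneck")
```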

Once the line is working well, 'workers' can be added to the environment. These are camera viewpoints that act as avatars for users, each assigned wearable edge hardware such as a heads-up display (HUD). Each avatar is then given a GUI whose digital assets are superimposed on the assembly, with instructions keyed to the timeframe of the manufacturing cycle or other comparable checkpoints. Once all this is done, you are ready to test and roll out. Additionally, if you deal with process controls and need real-time monitoring, that data can be fed into the HUD of each relevant user via edge devices that watch those systems in real time.
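One way to picture the checkpoint-to-instruction mapping is a small data structure like the hypothetical one below, where each HUD step names the overlay asset and the point in the cycle at which it appears; the step names, timings and assets are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class HudStep:
    """One superimposed instruction tied to a point in the manufacturing cycle."""
    checkpoint_s: float   # seconds into the work-cell cycle when the step appears
    overlay_asset: str    # digital asset anchored on the physical assembly
    instruction: str      # text shown in the worker's HUD

# Illustrative sequence for one work cell; real content would come from the
# animated CAD layout described above.
steps = [
    HudStep(0.0,  "bracket_left.glb",  "Fit left bracket, torque to 12 Nm"),
    HudStep(45.0, "bracket_right.glb", "Fit right bracket, torque to 12 Nm"),
    HudStep(90.0, "loom_clip.glb",     "Route loom through highlighted clips"),
]

def current_step(elapsed_s: float) -> HudStep:
    """Return the step whose checkpoint has most recently been passed."""
    active = [s for s in steps if s.checkpoint_s <= elapsed_s]
    return active[-1] if active else steps[0]

step = current_step(elapsed_s=60.0)
print(step.instruction, "->", step.overlay_asset)
```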

As you can tell, structured light 3D scanning technology is the biggest factor in reducing the time needed for augmented reality development during any build-out. Why? Because it captures the manufacturing environment accurately and precisely from the start, which frees you to focus on the actual workflow of the system and on optimising it before the project's inevitable deadline.

Looking Ahead
Carried out properly, augmented reality development allows virtually untrained staff to work within complex systems with less potential for waste, and it reduces the burden on shop-floor staff to police quality. The stakes are high: if even one subassembly is built wrong, it could take 80% more time to resolve, holding up production or generating excessive scrap.