Application of Augmented Reality


Augmented Reality (AR) is a technology that overlays computer graphics on the real world (Figure 1). One of the best overviews of the technology is [4], which defined the field, described many problems, and summarized the developments up to that point. That paper provides a starting point for anyone interested in researching or using Augmented Reality.

AR falls within a more general context termed Mixed Reality (MR) [20], which refers to a multi-axis spectrum of areas covering Virtual Reality (VR), AR, telepresence, and other related technologies.

Virtual Reality is a term used for computer-generated 3D environments that allow the user to enter and interact with synthetic environments [9][27][28]. Users are able to "immerse" themselves to varying degrees in the computer's artificial world, which may be either a simulation of some form of reality [10] or the simulation of a complex phenomenon [33][9].

Figure 1: AR example with virtual chairs and a virtual lamp.

In telepresence, the fundamental motive is to expand the operator’s sensory-motor facilities and problem-solving abilities to a remote environment. In this sense, telepresence can be defined as a human/machine system in which the human operator receives sufficient information about the teleoperator and the task environment, displayed in a sufficiently natural way, that the operator feels physically present at the remote site [26]. Very similar to virtual reality, in which we aim to achieve the illusion of presence within a computer simulation, telepresence aims to achieve the illusion of presence at a remote location.

AR lies between VR and telepresence: while in VR the environment is entirely synthetic and in telepresence it is completely real, in AR the user sees the real world augmented with virtual objects.

When designing an AR system, three aspects must be kept in mind: (1) combination of real and virtual worlds; (2) interactivity in real time; (3) registration in 3D.

Wearable devices, like Head-Mounted Displays (HMD) [28], could be used to show the augmented scene, but other technologies are also available [4]. 

Apart from the three aspects mentioned above, another one could be added: portability. In almost all virtual environment systems, the user is not allowed to move around much due to device limitations. However, some AR applications require the user to actually walk through a large environment. Thus, portability becomes an important issue.

For such applications, 3D registration becomes even more complicated. Wearable computing applications normally provide unregistered text/graphics information using a monocular HMD. These systems are more of a "see-around" setup and not an Augmented Reality system under the strict definition. As a result, computing platforms and wearable display devices used in AR must often be adapted from those developed for more general applications (section 3).

The field of Augmented Reality has existed for just over a decade, but the growth and progress in the past few years have been remarkable [12]. Since [4], the field has grown rapidly. A number of conferences specializing in this area were started, including the International Workshop and Symposium on Augmented Reality, the International Symposium on Mixed Reality, and the Designing Augmented Reality Environments workshop.

2 AR Components

2.1 Scene Generator

The scene generator is the device or software responsible for rendering the scene. Rendering is not currently one of the major issues in AR, since only a few virtual objects need to be drawn, and they often do not have to be realistically rendered in order to serve the purposes of the application [4].

2.2 Tracking System 

The tracking system is one of the most important problems in AR systems, mostly because of the registration problem [3]. The objects in the real and virtual worlds must be properly aligned with respect to each other, or the illusion that the two worlds coexist will be compromised. Many applications demand accurate registration, especially medical systems [16][4].
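To make the registration problem concrete, the following sketch (not from the surveyed papers; plain Python with hypothetical camera parameters) projects a virtual anchor point through a pinhole camera model and shows how even a small tracking error in head orientation shifts the overlay on screen:

```python
import math

def project(point, yaw, f=800.0, cx=320.0, cy=240.0):
    """Project a 3D point (camera coordinates, z forward) through a
    pinhole camera whose estimated pose is off by `yaw` radians."""
    x, y, z = point
    # Rotate the point into the (mis)estimated camera frame.
    xr = math.cos(yaw) * x + math.sin(yaw) * z
    zr = -math.sin(yaw) * x + math.cos(yaw) * z
    return (f * xr / zr + cx, f * y / zr + cy)

# A virtual object anchored 2 m in front of the camera...
p = (0.0, 0.0, 2.0)
true_pos = project(p, yaw=0.0)
# ...and where it lands if the tracker misjudges head yaw by 1 degree.
err_pos = project(p, yaw=math.radians(1.0))
print(f"registration error: {err_pos[0] - true_pos[0]:.1f} px")  # ~14 px
```

A one-degree orientation error already displaces the virtual object by about 14 pixels here, which is why registration is so demanding for tracking hardware.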

2.3 Display

The technology for AR is still maturing, and solutions depend on design decisions. Almost all display devices for AR are HMDs (Head-Mounted Displays), but other solutions can be found (see section 3). When combining the real and virtual worlds, two basic choices are available: optical and video technology. Each has trade-offs depending on factors like resolution, flexibility, field of view, and registration strategies, among others [4]. Display technology continues to be a limiting factor in the growth of AR systems. There are still no see-through displays that have sufficient brightness, resolution, field of view, and contrast to seamlessly blend a wide range of real and virtual imagery. Moreover, many technologies that begin to approach these goals are not yet sufficiently small, lightweight, and low-cost. However, the past few years have seen a number of advances in see-through display technology, as we shall see next.

3 AR Devices

Five major classes of AR systems can be distinguished by their display type: Optical See-Through, Virtual Retinal Systems, Video See-Through, Monitor Based, and Projector Based. The following sections describe the corresponding devices and present their main features.

3.1 Optical See-Through HMD

Optical See-Through AR uses a transparent Head-Mounted Display to show the virtual environment directly over the real world (Figures 2 and 3). It works by placing optical combiners in front of the user's eyes. These combiners are partly transmissive, so that the user can look directly through them to see the real world. The combiners are also partly reflective, so that the user sees virtual images bounced off the combiners from head-mounted monitors.
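The combiner's behavior can be approximated as a per-pixel linear blend: the eye receives a fraction of the real-world light transmitted through the combiner plus a fraction of the monitor's light reflected off it. A minimal sketch, with hypothetical transmission and reflection coefficients:

```python
def combine(real, virtual, transmission=0.7, reflection=0.3):
    """Approximate what the eye sees through a partly transmissive,
    partly reflective optical combiner (per-channel values in 0..1)."""
    return tuple(transmission * r + reflection * v
                 for r, v in zip(real, virtual))

# Bright real background with a green virtual overlay.
seen = combine(real=(0.8, 0.8, 0.8), virtual=(0.0, 1.0, 0.0))
print(tuple(round(c, 2) for c in seen))  # (0.56, 0.86, 0.56)
```

Note that the blend can only add light to the transmitted scene: a purely optical see-through display cannot show an opaque black virtual object, which is why occlusion support requires additional optics.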

Key examples of Optical See-Through AR systems are the various augmented medical systems. MIT Image-Guided Surgery has focused on brain surgery [15]. UNC has been working with an AR-enhanced ultrasound system and other ways of overlaying radiographic images on a patient [23]. There are numerous Optical See-Through systems, as this seems to be the main direction for AR.

Despite these specific examples, there is still a lack of general-purpose see-through HMDs. One issue for Optical See-Through AR is the alignment of the HMD optics with the real world. A good HMD permits adjustments to fit the eye position and comfort of individual users. It should also be easy to move out of the way when not needed. However, these movements alter the registration of the virtual environment over the real world and require re-calibration of the system. An expensive solution would be to instrument the adjustments so that the system could automatically compensate for the motion. Such devices are not described in the literature.

Figure 2: Optical See-Through HMD

Figure 3: Optical See-Through Scheme

Recent Optical See-Through HMDs are being built by well-known companies like Sony and Olympus and support occlusion and varying accommodation (the process of focusing the eyes on objects at a particular distance). There are very small models that can be attached to conventional eyeglasses (Figure 4).

Figure 4: Eyeglass display with holographic element

3.2 Virtual Retinal Systems

The VRD (Virtual Retinal Display) was designed at the University of Washington in the Human Interface Technology Lab (HIT) in 1991. The aim was to produce a full-color, wide field-of-view, high-resolution, high-brightness, low-cost virtual display. Microvision Inc. has the exclusive license to commercialize the VRD technology (Figure 5).
This technology has many potential applications, from head-mounted displays (HMDs) for military/aerospace applications to medical purposes. The VRD projects a modulated beam of light (from an electronic source) directly onto the retina of the eye, producing a rasterized image (Figure 6). The viewer has the illusion of seeing the source image as if standing two feet away in front of a 14-inch monitor. In reality, the image is on the retina of the eye and not on a screen. The perceived image quality is excellent, with stereo view, full color, wide field of view, and no flicker [13][24].

Figure 5: Virtual Retinal System HMD.

Figure 6: Virtual Retinal System Scheme.

3.3 Video See-Through HMD

Video See-Through AR uses an opaque HMD to display merged video of the virtual environment and the view from cameras mounted on the HMD (Figure 7). This approach is more complex than Optical See-Through AR, requiring proper placement of the cameras (Figure 8). On the other hand, the video compositing of the real and virtual worlds is much easier. A variety of compositing strategies are available, including chroma-keying and depth mapping. The Mixed Reality Systems Lab (MRSL) of Japan presented a stereo video see-through HMD at ISAR 2000. This device addresses some of the parallax caused by the offset between the cameras and the eyes.
Figure 7: Video See-Through HMD

Figure 8: Video See-Through Scheme.

3.4 Monitor Based

Monitor-Based AR also uses merged video streams, but the display is a more conventional desktop monitor or a handheld display. It is perhaps the least demanding AR configuration, as it eliminates HMD issues. Princeton Video Image, Inc. has developed a technique for merging graphics into real-time video streams. Their work is regularly seen as the first-down line in American football games. It is also used for placing advertising logos into various broadcasts.

3.5 Projection Displays

Projector Based AR uses real-world objects as the projection surface for the virtual environment (Figures 11 and 12).

Figure 9: Monitor Based Scheme

Figure 10: Monitor Based Example.

It has applications in industrial assembly, product visualization, etc. Projector Based AR is also well suited to multiple-user situations. The alignment of the projectors and the projection surfaces is critical for successful applications.

Figure 11: Projector Based AR.

4 Applications

Augmented Reality technology has many possible applications in a wide range of fields, including entertainment, education, medicine, engineering, and manufacturing. 
It is expected that other application areas will appear as the technology spreads.

Figure 12: Projector Based AR

4.1 Medical 

Since imaging technology is so pervasive throughout the medical field, it is not surprising that this domain is seen as one of the most important for augmented reality systems. Most of the medical applications deal with image-guided surgery (Figure 13) [15].
Pre-operative imaging studies of the patient, such as CT (Computed Tomography) or MRI (Magnetic Resonance Imaging) scans, give the surgeon the necessary view of the internal anatomy. From these images the surgery is planned. Visualization of the path through the anatomy of the affected area (where a tumor must be removed, for instance) is done by first creating a 3D model from the multiple views and slices in the pre-operative study. The model is then projected onto the target surface to support the surgery. Augmented reality can be applied so that the surgical team can see the CT or MRI data correctly registered on the patient in the operating theater while the procedure is in progress. Being able to accurately register the images at this point will enhance the performance of the surgical team and eliminate the need for the painful and cumbersome stereotactic frames that are currently used for registration [15].
Another application for augmented reality in the medical domain is in ultrasound imaging [2]. Using an optical see-through display, the ultrasound technician can view a volumetric rendered image of the fetus overlaid on the abdomen of the pregnant woman. The image appears as if it were inside the abdomen and is correctly rendered as the user moves [25] (Figure 14).

4.2 Entertainment

A simple form of augmented reality has been in use in the entertainment and news business for quite some time. Whenever you watch the evening weather report, the presenter appears to stand in front of changing weather maps. In the studio, the reporter is actually standing in front of a blue screen. This real image is augmented with computer-generated maps using a technique called chroma-keying. Another entertainment area where AR is being applied is game development [31] (Figures 15 and 16). Princeton Electronic Billboard has developed an augmented reality system that allows broadcasters to insert advertisements into specific areas of the broadcast image (Figure 17). For example, while broadcasting a baseball game, this system is able to place an advertisement in the image so that it appears on the outfield wall of the stadium.
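The chroma-keying technique mentioned above can be sketched as a per-pixel test: pixels close to the key color (the blue screen) are replaced by the computer-generated background. A toy illustration, with images reduced to flat lists of RGB tuples and an arbitrary key color and tolerance:

```python
def chroma_key(studio, background, key=(0, 0, 255), tol=60):
    """Replace every studio pixel close to the key color (blue screen)
    with the corresponding background pixel. Images are flat lists of
    RGB tuples; `tol` is a Euclidean distance threshold."""
    def near(px):
        return sum((a - b) ** 2 for a, b in zip(px, key)) ** 0.5 <= tol
    return [bg if near(px) else px for px, bg in zip(studio, background)]

# One scanline: a presenter pixel followed by two blue-screen pixels.
studio = [(200, 150, 120), (10, 5, 250), (0, 0, 255)]
weather_map = [(30, 30, 30), (90, 200, 90), (95, 205, 95)]
print(chroma_key(studio, weather_map))
# [(200, 150, 120), (90, 200, 90), (95, 205, 95)]
```

Real broadcast keyers are far more sophisticated (soft edges, spill suppression), but the principle of substituting keyed pixels is the same.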

Figure 15: Games using a virtual table and synthetic objects.

The electronic billboard requires calibration to the stadium, which is done by taking images from typical camera angles and zoom settings in order to build a map of the stadium, including the locations in the images where advertisements will be inserted. Using pre-specified reference points in the stadium, the system automatically determines the camera angle being used and, referring to the pre-defined stadium map, inserts the advertisement into the correct place.
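The geometric core of this insertion step is a planar projective transform (a homography) from the stadium map to the current camera image. Assuming the homography has already been estimated from the reference points, the ad region's corners can be mapped as follows (the matrix values here are purely illustrative):

```python
def apply_homography(H, pt):
    """Map a 2D point through a 3x3 planar homography."""
    x, y = pt
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

# Purely illustrative homography: scale by 2, translate by (10, 20).
# A real system would estimate H from the stadium's reference points.
H = [[2.0, 0.0, 10.0],
     [0.0, 2.0, 20.0],
     [0.0, 0.0, 1.0]]

# Corners of the ad region on the pre-built stadium map.
ad_corners = [(0, 0), (100, 0), (100, 50), (0, 50)]
image_corners = [apply_homography(H, c) for c in ad_corners]
print(image_corners)  # where to warp the ad into the broadcast frame
```

The advertisement image is then warped into the quadrilateral defined by `image_corners` for each broadcast frame.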

Figure 17: Advertisement on a Football game.

4.3 Military Training

The military has long used displays in cockpits that present information to the pilot on the windshield of the cockpit or on the visor of the flight helmet (Figure 18). This is a form of augmented reality display. By equipping military personnel with helmet-mounted visor displays or a special-purpose rangefinder, the activities of other units participating in the operation can be imaged. While looking toward the horizon during a training exercise, for example, a display-equipped soldier could see a virtual helicopter rising above the tree line. This helicopter could be flown in simulation by another participant. In wartime, the display of the real battlefield scene could be augmented with annotation information or highlighting to emphasize hidden enemy units [32].

4.4 Engineering Design

Imagine that a group of designers is working on the model of a complex device for their clients. The designers and clients want to hold a joint design review even though they are physically separated. If each of them had a conference room equipped with an augmented reality display, this could be accomplished. The physical model that the designers have built is imaged and displayed in the clients' conference room in 3D. The clients can walk around the display, looking at different aspects of it. To hold discussions, the clients can point at the model to highlight sections, and this will be reflected on the real model in the augmented display that the designers are using. Or perhaps, at an earlier stage of the design, before a prototype is built, the view in each conference room is augmented with a computer-generated image of the current design built from the CAD files describing it [1] (Figure 19).

Figure 19: AR applied to Engineering Design. This figure shows a real object augmented with virtual tubes

4.5 Robotics and Telerobotics

In the domain of robotics and telerobotics, an augmented display can assist the user of the system [17][21]. A telerobotic operator uses a visual image of the remote workspace to guide the robot. Annotation of the view is as useful here as it is when the scene is directly in front of the operator. In addition, augmentation with wireframe drawings of structures in the view can facilitate visualization of the remote 3D geometry. If the operator is attempting a motion, it can first be practiced on a virtual robot that is visualized as an augmentation to the real scene. The operator can then decide whether to proceed with the motion after seeing the results. The robot motion can then be executed directly, which in a telerobotics application eliminates any oscillations caused by long delays to the remote site. Another use of robotics and AR is in remote medical operations (Figures 20 and 21).

Figure 20: Virtual surgery using robot arms.

Figure 21: Robotics using AR for remote medical operation

4.6 Manufacturing, Maintenance and Repair

When a maintenance technician approaches a new or unfamiliar piece of equipment, instead of opening several repair manuals, they could put on an augmented reality display. In this display, the image of the equipment would be augmented with annotations and information pertinent to the repair. For example, the location of fasteners and attachment hardware that must be removed would be highlighted (Figure 22). Boeing built an experimental system in which technicians are guided by an augmented display that shows the routing of cables on a generic frame used for all harnesses (Figure 23). The augmented display allows a single fixture to be used for making the many different harnesses [29].
Figure 22: AR used to aid mechanical work.

Figure 23: AR applied to maintenance work.

4.7 Collaborative AR

AR addresses two key issues with collaboration: integration with existing tools and practices, and enhancing practice by supporting remote and co-located activities that would otherwise be impossible. Collaborative AR systems have been constructed using projectors, hand-held displays, and head-worn displays. By using projectors to augment the surfaces in a collaborative environment, users are unencumbered, can see each other's eyes, and are guaranteed to see the same augmentations [6]. Examples of collaborative AR systems using see-through displays include both those that use see-through handheld displays and those that use see-through head-worn displays [14] (Figure 24).

5 Visualization Issues

Researchers have begun to address issues in displaying information in AR caused by the nature of AR technology and displays. Work has been done on adapting to registration errors and on avoiding the occlusion of critical information due to display density issues.

Figure 24: The Studierstube collaborative AR system.

5.1 Visualization Errors

In some AR systems, registration errors are significant and unavoidable. For example, the measured location of an object in the environment may not be known accurately enough to avoid visible registration error. Under such conditions, one approach to rendering an object is to visually display the region in screen space where the object could reside, based on the expected tracking and measurement errors [19]. This guarantees that the virtual representation always contains its real counterpart. Another approach, when rendering virtual objects that should be occluded by real objects, is to use a probabilistic function that gradually fades out the hidden virtual object along the edges of the occluded region, making registration errors less jarring [11].
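The edge-fading idea can be sketched with a simple stand-in: here a linear opacity ramp replaces the probabilistic function of [11], making the virtual object fully transparent at the occlusion boundary and fully opaque a few pixels inside the visible region:

```python
def fade_alpha(dist_px, margin=12.0):
    """Opacity of a virtual object near an occluding real object's edge:
    fully hidden at the boundary, fully visible `margin` pixels inside
    the unoccluded region, with a linear ramp in between."""
    return max(0.0, min(1.0, dist_px / margin))

# Alpha at increasing distances from the occlusion boundary:
print([fade_alpha(d) for d in (0, 3, 6, 12, 24)])
# [0.0, 0.25, 0.5, 1.0, 1.0]
```

Because the transition is gradual, a registration error of a few pixels shifts the ramp rather than producing a hard, obviously misplaced silhouette.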

5.2 Removing real objects from the environment 

The problem of removing real objects is more than simply extracting depth information from a scene. The system must also be able to segment individual objects in that environment. A semi-automatic method for identifying objects and their locations in the scene through silhouettes is found in [18]. This enables the insertion of virtual objects and deletion of real objects without an explicit 3D reconstruction of the environment (Figure 25).

Figure 25: Virtual/Real occlusions. The brown cow and tree are virtual (the rest is real)

5.3 Photorealistic Rendering

A key prerequisite for improving the rendering quality of virtual objects in AR applications is the ability to automatically capture the environmental lighting information [7][22][30]. For example, [8] presents a method that, using only an uncalibrated camera, allows the capture of object geometry and appearance and then, at a later stage, rendering and AR overlay into a new scene.
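As a toy illustration of using captured lighting (not the method of [8] or the other cited works), one crude approach is to estimate an ambient color from the camera image and tint the virtual object's base color by it, so the object picks up the scene's color cast:

```python
def estimate_ambient(pixels):
    """Crude ambient-light estimate: mean scene color, scaled to 0..1."""
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) / n / 255.0 for c in range(3))

def relight(albedo, ambient):
    """Tint a virtual object's base color by the estimated lighting."""
    return tuple(round(a * l) for a, l in zip(albedo, ambient))

scene = [(255, 200, 150), (245, 190, 140)]   # warm indoor lighting
ambient = estimate_ambient(scene)
print(relight((200, 200, 200), ambient))     # grey virtual object, warmed
# (196, 153, 114)
```

The cited methods go much further, recovering directional light sources and shadows, but even this crude tint makes a virtual object blend noticeably better than rendering it under neutral light.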

6 Conclusions and Future Work 

Despite the many recent advances in AR, much work remains to be done. Application development can be assisted by using available libraries. One of them is ARToolkit [5], which provides computer vision techniques to calculate a camera's position and orientation relative to marked cards, so that virtual 3D objects can be overlaid precisely on the markers. Below are a few areas requiring further research if AR is to become commonly deployed.
Ubiquitous tracking and system portability: Several noteworthy AR demonstrations have produced convincing environments with nearly pixel-accurate registration. However, such demonstrations work only within restricted, carefully prepared environments. The ultimate goal is a tracking system that supports accurate registration in any arbitrary unprepared environment, indoors or outdoors. Allowing AR systems to go anywhere also requires portable and wearable systems that are comfortable and unobtrusive.
Ease of setup and use: Most existing AR systems require expert users (generally the system designers) to calibrate and operate them. If AR applications are to become commonplace, then the systems must be deployable and operable by non-expert users. This requires more robust systems that avoid or minimize calibration and setup requirements.
Photorealistic and advanced rendering: Although many AR applications only need simple graphics, such as wireframe outlines and text labels, the ultimate goal is to render the virtual objects so that they are indistinguishable from real ones. This must be done in real time, without the manual intervention of artists or programmers. New techniques in image-based rendering must be considered in order to accomplish this task [7].
AR in all senses: Researchers have focused primarily on augmenting the visual sense. Eventually, compelling AR environments may require engaging the other senses as well (touch, hearing, etc.).
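As an illustration of the marker-based overlay approach that libraries like ARToolkit enable, the sketch below (hypothetical pose values, not ARToolkit's actual API) transforms a point of a virtual object from marker coordinates into camera coordinates using an estimated pose (R, t); projecting that point then places the object on the card:

```python
def to_camera(R, t, p):
    """Transform a point from marker coordinates to camera coordinates
    using a pose (R, t) such as a marker tracker estimates."""
    return tuple(sum(R[i][j] * p[j] for j in range(3)) + t[i]
                 for i in range(3))

# Hypothetical pose: marker card 50 cm in front of the camera, unrotated.
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = (0.0, 0.0, 0.5)

# A corner of a virtual cube, expressed in the marker's own frame, stays
# attached to the card wherever the tracked pose says the card is.
corner = (0.02, 0.02, 0.0)
print(to_camera(R, t, corner))  # (0.02, 0.02, 0.5)
```

Because the pose is re-estimated every frame, the virtual object follows the card as it moves, which is what makes marker-based registration practical for non-expert setups.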


[1] K. Ahlers and A. Kramer. Distributed augmented reality for collaborative design applications. European Computer Industry Research Center, 3-14, 1995. 

[2] S. Andrei, D. Chen, C. Tector, A. Brandt, H. Chen, R. Ohbuchi, M. Bajura, and H. Fuchs. Case study: Observing a volume rendered fetus within a pregnant patient. Proceedings of IEEE Visualization, 17-21, 1993.

[3] R. Azuma. Tracking requirements for augmented reality. Communications of the ACM, 36(7):50-51, 1993. 

[4] R. Azuma. A survey of augmented reality. ACM SIGGRAPH, 1-38, 1997. 

[5] M. Billinghurst, S. Baldis, E. Miller, and S. Weghorst. Shared space: Collaborative information spaces. Proc. of HCI International, 7-10, 1997. 

[6] M. Billinghurst and H. Kato. Mixed reality – merging real and virtual worlds. Proc. International Symposium on Mixed Reality (ISMR ’99), 261-284, 1999. 

[7] S. Boivin and A. Gagalowicz. Image-based rendering for industrial applications. ERCIM News, 2001.

[8] D. Cobzas, K. Yerex, and M. Jagersand. Editing real world scenes: Augmented reality with image-based rendering. Proc. of IEEE Virtual Reality, 291- 292, 2003. 

[9] A. Van Dam, A. Forsberg, D. Laidlaw, J. LaViola, and R. Simpson. Immersive VR for scientific visualization: A progress report. IEEE Computer Graphics and Applications, 20(6): 26-52, 2000.

[10] P. du Pont. Building complex virtual worlds without programming. EUROGRAPHICS’95 State Of The Art Reports, 61–70, 1995. 

[11] A. Fuhrmann et al. Occlusion in collaborative augmented environments. Computers & Graphics, 23 (6): 809-819, 1999.

[12] R. Azuma et al. Recent advances in augmented reality. IEEE Computer Graphics and Applications, 20-38, 2001. 

[13] R. Chinthammit et al. Head tracking using the virtual retinal display. Second IEEE and ACM International Symposium on Augmented Reality, 235-242, 2001.

[14] Z. Szalavri et al. Studierstube: An environment for collaboration in augmented reality. Virtual Reality Systems, Development and Application, 3 (1): 37-48, 1998.

[15] W. Grimson, G. Ettinger, T. Kapur, M. Leventon, W. Wells, and R. Kikinis. Utilizing segmented MRI data in image-guided surgery. International Journal of Pattern Recognition and Artificial Intelligence, 11(8):1367-97, 1998.

[16] Richard Lee Holloway. Registration errors in augmented reality systems. Technical Report TR95-016, The University of North Carolina, 1995.

[17] W. S. Kim and P. Schenker. An advanced operator interface design with preview/predictive displays for groundcontrolled space telerobotic servicing. Proceedings of SPIE Vol. 2057: Telemanipulator Technology and Space Telerobotics, 96-107, 1993. 

[18] V. Lepetit and M. Berger. Handling occlusions in augmented reality systems: A semi-automatic method. Proc. Int.l Symp. Augmented Reality, 137-146, 2000. 

[19] B. MacIntyre and E. Coelho. Adapting to dynamic registration errors using level of error (loe) filtering. Proc. Int.l Symp. Augmented Reality, 85-88, 2000. 

[20] P. Milgram and F. Kishino. A taxonomy of mixed reality visual displays. IEICE Transactions on Information Systems, E77-D (12): 1321-1329, 1994. 

[21] P. Milgram and S. Zhai. Applications of augmented reality for human-robot communications. Proceedings of 1993 IEEE/RSJ International Conference on Intelligent Robots and Systems, 1467- 1476, 1993. 

[22] Y. Mukaigawa, S. Mihashi, and T. Shakunaga. Photometric image-based rendering for virtual lighting image synthesis. Proc. 2nd Int.l Workshop Augmented Reality (IWAR ’99), 115-124, 1999.

[23] Ryutarou Ohbuchi, David Chen, and Henry Fuchs. Incremental volume reconstruction and rendering for 3D ultrasound imaging. Visualization in Biomedical Computing, 1808: 312-323, 1992. 

[24] H.L. Pryor, T.A. Furness, and E. Viirre. The virtual retinal display: A new display technology using scanned laser light. Proc. 42nd Human Factors Ergonomics Society, pp. 1149, 1998. 

[25] F. Sauer, A. Khamene, B. Bascle, L. Schimmang, F. Wenzel, and S. Vogt. Augmented reality visualization of ultrasound images: System description, calibration and features. IEEE and ACM International Symposium on Augmented Reality, 30-39, 2001. 

[26] T.B. Sheridan. Musing on telepresence and virtual presence. Presence, 1(1):120-125, 1992. 

[27] W. Sherman and A. Craig. Understanding Virtual Reality: Interface, Applications and Design. Morgan Kaufmann Publishers, 2003. 

[28] R. Silva and G. Giraldi. Introduction to virtual reality. Technical Report: 06/2003, LNCC, Brazil, 2003. 

[29] D. Sims. New realities in aircraft design and manufacture. IEEE Computer Graphics and Applications, 14 (2): 91, 1994. 

[30] J. Stauder. Augmented reality with automatic illumination control incorporating ellipsoidal models. IEEE Trans. Multimedia, 1 (2): 136-143, 1999. 

[31] Z. Szalavri, E. Eckstein, and M. Gervautz. Collaborative gaming in augmented reality. VRST, Taipei, Taiwan, 195–204, 1998. 

[32] E. Urban. The information warrior. IEEE Spectrum, 32 (11): 66-70, 1995. 

[33] D. Weimer. "Brave New Virtual Worlds". In Frontiers of Scientific Visualization, chapter 9, pages 245–278. John Wiley and Sons, Inc., 1994.
