Augmented Reality (AR) is a major buzzword, and a topic that has truly captured the imagination of mobile application developers. In AR applications, a live view of the physical, real-world environment is augmented with virtual content, providing a more immersive user experience. Pokemon Go may be the first thing that comes to mind when you think of AR mobile applications, but there are plenty of mobile apps that harness the power of AR technology. For instance, Snapchat uses AR to add masks and filters to the device's camera feed, and the Word Lens feature in Google Translate is powered by AR.
What is ARCore?
ARCore is a Google platform that enables your applications to "see" and understand the physical world through your device's camera. Rather than relying on user input, Google ARCore automatically searches for clusters of feature points that it uses to understand its environment. Specifically, ARCore looks for clusters that indicate the presence of common horizontal and vertical surfaces, such as floors, walls, and desks, and then makes these surfaces available to your application as planes. ARCore can also detect light levels and light sources, and uses this information to create realistic shadows for any AR objects that users place in the augmented scene.
Steps to build an augmented reality Android app
- Import the 3D models with the Sceneform plugin
Generally, working with 3D models requires specialist knowledge, but with the arrival of the Sceneform plugin, Google made it possible to render 3D models in Java, without learning OpenGL. The Sceneform plugin provides a high-level API that can be used to create Renderables from standard Android widgets, materials, or shapes, or from 3D assets such as .FBX or .OBJ files. Whenever you import a file using Sceneform, the plugin will automatically:
- Convert the asset file into a .sfb file. This is a runtime-optimized Sceneform Binary format (.sfb) that is added to your APK and then loaded at runtime. We'll be using this .sfb file to create a Renderable, which consists of meshes, materials, and textures, and can be placed anywhere in the augmented scene.
- Create a .sfa file. This is an asset description file: a text file containing a human-readable description of the .sfb file. Depending on the model, you may be able to change its appearance by editing the text inside the .sfa file.
- Install the Sceneform plugin
The Sceneform plugin requires Android Studio 3.1 or higher. If you're unsure which version of Android Studio you're running, select "Android Studio > About Android Studio" from the toolbar. The resulting popup contains some basic information about your Android Studio installation, including its version number.
- Sceneform UX and Java 8
Let's begin by adding the dependencies we'll use throughout this project. Open your module-level build.gradle file and add the Sceneform UX library, which contains the ArFragment class. Sceneform uses language constructs from Java 8, so we'll also need to update the project's Source Compatibility and Target Compatibility to Java 8. Finally, we need to apply the Sceneform plugin.
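The Gradle changes described above might look something like the following sketch; the library version number is an assumption and should be replaced with the latest Sceneform release:

```groovy
// Module-level build.gradle (version number is illustrative)
apply plugin: 'com.android.application'
// Apply the Sceneform plugin so imported assets are processed at build time
apply plugin: 'com.google.ar.sceneform.plugin'

android {
    compileOptions {
        // Sceneform requires Java 8 language features
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}

dependencies {
    // Sceneform UX library, which provides ArFragment
    implementation 'com.google.ar.sceneform.ux:sceneform-ux:1.17.1'
}
```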
- Request the permissions with ArFragment
The application will use the camera to analyze its surroundings and position 3D models in the real world. Before the application can access the camera, it requires the camera permission, so open your project's Manifest and add the following uses-permission element:
<uses-permission android:name="android.permission.CAMERA"/>
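Since this app cannot function without ARCore, you can also declare in the Manifest that ARCore is required, so that Google Play only offers the app to supported devices. A minimal sketch, assuming an AR-required app:

```xml
<!-- Inside the <application> element: declare that ARCore is required -->
<meta-data android:name="com.google.ar.core" android:value="required" />
```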
- Integrate ArFragment into the layout
Once ArFragment has verified that the device can support your application's AR features, it creates an ArSceneView and an ARCore session, and your application's AR experience is ready to go!
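Embedding ArFragment in the activity layout might look like the following sketch; the layout file name and fragment id are assumptions:

```xml
<!-- res/layout/activity_main.xml: embed ArFragment in the activity -->
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <fragment
        android:id="@+id/arFragment"
        android:name="com.google.ar.sceneform.ux.ArFragment"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />
</FrameLayout>
```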
- Download a 3D model with the help of Google's Poly
There are a few different ways to create Renderables; for example, Sceneform can import 3D assets in formats such as .FBX, .glTF, and .OBJ.
Import models into Android Studio
Once you have the asset, you need to import it into Android Studio using the Sceneform plugin. This is a multi-step process that requires you to:
- Create a "sampledata" folder. Sample data is a folder type for design-time sample data that won't be included in the APK, but will be available in the Android Studio editor.
- Drag and drop the original .OBJ asset file into your "sampledata" folder.
- Perform the Sceneform import and conversion on the .OBJ file, which will generate the .sfa and .sfb files.
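The import step registers the conversion in your module-level build.gradle. Assuming a model named model.obj, the generated entry looks roughly like this sketch:

```groovy
// Added automatically by the Sceneform plugin's "Import Sceneform Asset" action
// (file names are assumptions, based on a model called model.obj)
sceneform.asset('sampledata/model.obj',   // source asset
        'default',                        // material to apply
        'sampledata/model.sfa',           // editable asset description
        'src/main/assets/model.sfb')      // runtime binary bundled in the APK
```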
Display your model
- Create a member variable for the ArFragment, and use it to reference the fragment in your MainActivity class.
- Create a ModelRenderable. You need to build the .sfb file into a ModelRenderable in order to render the 3D object.
- React to user input. Register a callback that fires whenever the user taps a detected plane.
- Anchor your model. Retrieve the ArSceneView and attach an AnchorNode, which acts as the parent node of the scene.
- Add support for scaling, moving, and rotating. Create a TransformableNode, which is responsible for scaling, moving, and rotating nodes in response to user gestures.
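Putting the steps above together, the core of MainActivity might look like the following sketch. It assumes the fragment id arFragment and an asset named model.sfb (both illustrative), and omits error handling for brevity:

```java
// MainActivity.java (sketch; assumes Sceneform 1.x and an asset named model.sfb)
public class MainActivity extends AppCompatActivity {

    private ArFragment arFragment;
    private ModelRenderable modelRenderable;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        // Reference the ArFragment declared in the layout
        arFragment = (ArFragment) getSupportFragmentManager()
                .findFragmentById(R.id.arFragment);

        // Build the .sfb file into a ModelRenderable asynchronously
        ModelRenderable.builder()
                .setSource(this, Uri.parse("model.sfb"))
                .build()
                .thenAccept(renderable -> modelRenderable = renderable);

        // React to taps on a detected plane
        arFragment.setOnTapArPlaneListener(
                (HitResult hitResult, Plane plane, MotionEvent motionEvent) -> {
            if (modelRenderable == null) {
                return;
            }
            // Anchor the model to the tapped location
            Anchor anchor = hitResult.createAnchor();
            AnchorNode anchorNode = new AnchorNode(anchor);
            anchorNode.setParent(arFragment.getArSceneView().getScene());

            // TransformableNode adds scale, move, and rotate gestures
            TransformableNode transformableNode =
                    new TransformableNode(arFragment.getTransformationSystem());
            transformableNode.setParent(anchorNode);
            transformableNode.setRenderable(modelRenderable);
            transformableNode.select();
        });
    }
}
```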
Test your Google ARCore augmented reality app
Test the application on a physical, ARCore-supported Android device.
Author Bio –
Alex Jone is a Marketing Manager at AIS Technolabs, a web design and development company helping global businesses to grow. He loves to share thoughts on top augmented reality companies.