We believe there is something unique at every business that will ignite the fuse of innovation.

Machine Learning (ML), Augmented Reality (AR), and user engagement were some of the core tenets of Google I/O '18. Google has doubled down on its investment in ML and is making it readily available for enterprise Android and iOS applications via Firebase ML Kit. AR for Android (ARCore) got a facelift with new features for cross-device, cross-platform AR experiences. And finally, true to Google's core, Google has made connecting users with applications that fit their needs a top priority with App Actions and Slices.

Among these updates, here are some announcements that businesses should pay close attention to:

App Actions & Slices

Getting your application discovered amongst the millions of apps in the Play Store is reminiscent of the early days of the web. Then, once you have earned the coveted install, engaging the user with your application without spamming them with notifications is genuinely difficult. Actions and Slices are here to help.

Actions

To help connect users with your application, Google introduced App Actions. An Action is a form of deep link that is shown to the user when they need it most. For example, if your application can provide the user's credit score and the user searches for "credit score" in the Search app, your application can be surfaced and take the user directly to the feature that can help. This can increase engagement and usage of your app.

[Image: App Actions screen demo]
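An Action is declared in an XML file that maps an intent to a deep link in your app. A minimal sketch of what this might look like, assuming the `actions.intent.OPEN_APP_FEATURE` built-in intent; the URL template and parameter names are hypothetical:

```xml
<!-- res/xml/actions.xml (sketch; deep link and parameter names are hypothetical) -->
<actions>
    <action intentName="actions.intent.OPEN_APP_FEATURE">
        <!-- Google fills {feature} from the user's query and opens this link -->
        <fulfillment urlTemplate="https://example.com/open{?feature}">
            <parameter-mapping
                intentParameter="feature"
                urlParameter="feature" />
        </fulfillment>
    </action>
</actions>
```

Your app then handles the resulting deep link exactly as it would any other incoming link.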

Slices

Slices are similar to Actions, with the added benefit that you can surface interactive features as well. You can show users dynamic information of your choosing, based on the context of what they are searching for at the time.

[Image: Slices demo]
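Under the hood, a Slice is served by a `SliceProvider` registered in your app's manifest. A minimal sketch in Kotlin, assuming the AndroidX `slice-builders` library; the credit-score content and URI path are hypothetical:

```kotlin
// Sketch of a SliceProvider, assuming androidx.slice:slice-builders.
// The "/credit-score" path and the row content are hypothetical.
class CreditScoreSliceProvider : SliceProvider() {

    override fun onCreateSliceProvider(): Boolean = true

    override fun onBindSlice(sliceUri: Uri): Slice? {
        if (sliceUri.path != "/credit-score") return null
        // ListBuilder assembles the rows the host app (e.g. Search) will render.
        return ListBuilder(context!!, sliceUri, ListBuilder.INFINITY)
            .addRow(
                ListBuilder.RowBuilder()
                    .setTitle("Your credit score")
                    .setSubtitle("720 (updated today)")
            )
            .build()
    }
}
```

The host application requests the Slice by URI and renders it inline, so the same provider can back multiple surfaces.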

App Bundles and Dynamic Features

At Google I/O, Google stated that for every 6 MB increase in app size, you can expect about 1% fewer installs across all markets. The App Bundle is a new publishing format for your Android application. It delivers only the files a user's device actually needs (for its screen density, CPU architecture, and language) rather than every variation for every device. This reduces download size, which can lead to more installs.

Furthermore, App Bundles pave the way for Dynamic Features. With Dynamic Features, we no longer have to balance new features against a slowing install rate: you can now build a feature whose code and resources aren't delivered until the user attempts to use it. That way, features that aren't targeted at every user won't slow down everyone's experience.

[Animation: App Bundle delivery]
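In Gradle terms, a dynamic feature lives in its own module that the base app declares. A rough sketch, assuming a hypothetical `:feature_ar` module name:

```groovy
// app/build.gradle (base module): declare the on-demand module.
// The ':feature_ar' name is a hypothetical example.
android {
    dynamicFeatures = [':feature_ar']
}

// feature_ar/build.gradle (the dynamic feature module itself)
apply plugin: 'com.android.dynamic-feature'

dependencies {
    // Dynamic feature modules depend on the base app, not the other way around.
    implementation project(':app')
}
```

The feature module's manifest marks it as on-demand, and the app requests installation at runtime through the Play Core library when the user first needs it.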

ARCore Enhancements

AR was a key focus at Google I/O '18, with three major enhancements added to the existing ARCore for Android.

Sceneform

Sceneform paves the way for AR development on Android. Prior to Sceneform, developers had to be proficient in OpenGL and/or Unity to create AR experiences. Sceneform drastically lowers the barrier to entry for AR in your enterprise application. So if you haven't considered adding AR to your application already, now might be the time.
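To give a sense of how small that barrier becomes, here is a sketch of loading and placing a 3D model with Sceneform's Java/Kotlin API. It assumes an `ArFragment` is already in the layout and that a model has been imported as `res/raw/model.sfb` (a hypothetical asset name):

```kotlin
// Sketch: loading a 3D asset with Sceneform. `arFragment` and R.raw.model
// are assumptions about the host app; no OpenGL code is required.
ModelRenderable.builder()
    .setSource(context, R.raw.model)
    .build()
    .thenAccept { renderable ->
        // Attach the loaded model to the scene graph as a plain node.
        val node = Node()
        node.renderable = renderable
        arFragment.arSceneView.scene.addChild(node)
    }
    .exceptionally { error ->
        Log.e("Sceneform", "Unable to load renderable", error)
        null
    }
```

Model loading is asynchronous, so the renderable is attached in a completion callback rather than inline.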

Cloud Anchors

One of the most exciting enhancements, available for both iOS and Android, is Cloud Anchors. With Cloud Anchors, users on different devices can take part in the same AR experience. For example, if one user places an AR character at a specific point in the world, other users in the area will see the same character when viewing that spot through their own device. This has the potential to enable rich, interactive experiences that weren't possible before.

[Image: Cloud Anchors demo]
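The sharing flow boils down to hosting an anchor from one device and resolving it on another. A sketch using the ARCore SDK, assuming `session` is an ARCore `Session` with cloud anchors enabled in its `Config` and that you have some backend channel for sharing the ID:

```kotlin
// Sketch: sharing an AR position across devices with ARCore Cloud Anchors.
// `session`, `localAnchor`, and the sharing mechanism are assumptions.

// Device A: host a locally placed anchor with the cloud service.
val hosted = session.hostCloudAnchor(localAnchor)

// Hosting is asynchronous; check the state on later frames.
if (hosted.cloudAnchorState == Anchor.CloudAnchorState.SUCCESS) {
    val idToShare = hosted.cloudAnchorId  // send to other users via your backend
}

// Device B: resolve the shared ID into an anchor at the same real-world spot.
val resolved = session.resolveCloudAnchor(sharedId)
```

Once resolved, both devices render content relative to the same anchor, which is what keeps the shared experience aligned.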

Augmented Images

The addition of Augmented Images for Android lets the user interact with 2D surfaces such as posters or logos. For example, a company that sells products requiring instructions could offer a companion app that, when pointed at a specific part of the packaging, makes the use of the product come alive in a 3D demo right on the box.

[Image: Augmented Images demo]

Firebase ML Kit

Firebase ML Kit is a Machine Learning framework that enables your Android or iOS applications to use models trained by Google. These models currently include:

  • Text Recognition
  • Face Detection
  • Barcode Scanning
  • Image Labeling (labeling items in a specific image)
  • Landmark Recognition

If none of the above models fits your needs, Firebase ML Kit allows you to upload and serve your own custom model as well. This helps put machine learning within reach of most development teams.
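As a sense of scale for how little code the pre-trained models require, here is a sketch of on-device text recognition with the ML Kit SDK. It assumes Firebase is already configured in the app and that `bitmap` is a `Bitmap` you captured elsewhere:

```kotlin
// Sketch: on-device text recognition with Firebase ML Kit.
// Assumes the firebase-ml-vision dependency and an initialized Firebase app.
val image = FirebaseVisionImage.fromBitmap(bitmap)
val recognizer = FirebaseVision.getInstance().onDeviceTextRecognizer

recognizer.processImage(image)
    .addOnSuccessListener { result ->
        // result.text is the full recognized string; result.textBlocks
        // exposes per-block text and bounding boxes.
        Log.d("MLKit", result.text)
    }
    .addOnFailureListener { e ->
        Log.e("MLKit", "Text recognition failed", e)
    }
```

Because this model runs on-device, it works offline; ML Kit also offers cloud-backed variants of several models that trade a network round trip for higher accuracy.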

Android Jetpack

Google announced Android Jetpack, a new set of tools for Android developers that standardizes commonly built functionality. It will make development more straightforward and efficient, and, used correctly, should also result in less defective code. Prior to this standardization, there were multiple ways to do almost everything, with no true guidance from Google on which was the preferred practice. Google is becoming more opinionated in the developer tooling it provides and is asking developers to follow standardized best practices.

What's Next?

ML and AR are the clear future for enterprise mobile applications, and it is becoming increasingly important for you to take a look at what that means for your application's roadmap. Though ML and AR may require more long-term planning, many of the announcements above can be taken advantage of today to improve user experience and build more advanced applications.

Portions of this page are reproduced from work created and shared by the Android Open Source Project and used according to terms described in the Creative Commons 2.5 Attribution License.

About the Author

Clinton Teegarden

Clinton Teegarden is a Senior Software Engineer and Manager at CapTech based in the Washington, DC metro area. Clinton is passionate about bringing the greatest technologies in both Mobile and IoT to his clients using proven architecture and design patterns. Clinton specializes in all things Android and has led teams in delivering products for Fortune 500 clients, servicing millions of users.