How Will Augmented Reality and Artificial Intelligence Change Everything?
Looking back through history, we can trace how the way people communicate with computers has transformed.
- In the 1970s, people began talking directly to computers by typing text commands, an interface later popularized by Microsoft's MS-DOS operating system.
- Then came the graphical user interface (GUI). Apple introduced the Macintosh in 1984, followed a few years later by Microsoft Windows. The GUI was an essential stepping stone to the World Wide Web, which greatly expanded what we could do with personal devices.
- The third transformation came in 2007 with the iPhone, followed by Android phones. Touch became the primary interface, and it extended personal computing from everyone's desk to everywhere they go.
- The fourth transformation moves technology from what we carry to what we wear. The user interface shifts from a screen you tap to computer-generated images that you actually touch and feel. Instead of inputting with our fingers, we will type much faster with our eyes on virtual keyboards. The previous transformations were all about the interface between technology and people; this one is all about the experience, and that changes nearly everything.
By 2020, smart glasses will ignite a tsunami of adoption. People will buy the new devices for a single reason, perhaps entertainment or medical assistance, and then find more and more reasons to use them. They will start moving, a few minutes at a time, from their phones to their glasses.
By 2025, people will be spending far more time with smart glasses, and the smartest businesses will have shifted their efforts from aging mobile apps to the most innovative mixed reality (MR) applications.
The Fourth Transformation facts:
- You can manipulate the computer-generated images in your field of view.
- The computer and graphic processing units, power supplies, Wi-Fi, and Bluetooth connections will be inside your eyeglass frame.
- Your glasses will connect to the Internet of Things and will use machine learning to get progressively smarter about you.
- Your glasses will conduct transactions automatically, triggered by gestures, by voice command, or, most often, by simple and natural eye movements.
- 360-Degree Sound — audio that surrounds you from every direction, making the experience immersive.
- Point Cloud — point clouds convert sets of raw data into visible virtual objects. Over time, they are becoming fast enough to convert data into 3D images that precisely render the spatial relationships between you and the objects that surround you.
- Spatial Computing — through point clouds, computers can learn the contextual implications of location and the relationships of objects to one another.
- Artificial Intelligence (AI) — machines performing tasks that once required human intelligence, such as visual perception, speech recognition, decision-making, and language translation.
- Visual Web — on the visual web, people will buy and sell things online, wordlessly, through image recognition. The Blippar app already recognizes cars, faces, objects, and logos, a case of AR meeting AI. https://www.blippar.com
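The point cloud and spatial computing ideas above can be sketched in a few lines of code. This is a minimal, hypothetical illustration (not how any particular AR headset works): raw depth samples, like those from a depth camera, are converted into 3D points using a standard pinhole-camera model, and simple geometry then tells the device how far away the objects around you are. All values and names here are invented for the example.

```python
import numpy as np

def depth_to_points(depth, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """Convert a depth image (meters) into an N x 3 point cloud using a
    pinhole-camera model: x = (u - cx) * z / fx, y = (v - cy) * z / fy.
    The intrinsics fx, fy, cx, cy are made-up example values."""
    h, w = depth.shape
    v, u = np.mgrid[0:h, 0:w]          # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]          # drop pixels with no depth reading

# Toy depth image: a flat wall 2 m away with a nearer object patch at 1 m.
depth = np.full((480, 640), 2.0)
depth[200:280, 300:380] = 1.0

cloud = depth_to_points(depth)
# Distance from the viewer (at the origin) to the closest point in the scene:
nearest = np.linalg.norm(cloud, axis=1).min()
print(round(nearest, 2))  # → 1.0, the nearer object is about 1 m away
```

Once the scene exists as a point cloud like this, spatial computing is a matter of geometry: clustering points into objects, measuring distances between them, and anchoring virtual images to real surfaces.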
Here are some examples of the fourth transformation:
- Mark Zuckerberg's Facebook spent $2 billion to acquire Oculus VR, maker of the Rift headset
- Ford is using VR to design cars
- Sephora offers virtual lipstick and eyeliner that you can try on your own face using your mobile phone