Getting Started With AR/VR/MR

AECT 2020 Workshop
Thu, Oct 29, 1:00 to 4:00 pm EDT, Virtual AECT, Grand5
Curious about augmented and virtual reality but haven't had the time or funding to fully explore these emerging technologies? Getting Started with AR/VR/MR will give participants a hands-on opportunity to interact with a broad spectrum of software and hardware readily available to use with your students, staff, or faculty. In this three-hour sandbox, you'll get everything you need to get started with immersive technologies, including suggestions for implementing both software and hardware.
James Madison University Faculty Exploring Virtual Reality

Today's Agenda
1:00 pm - Introductions
? pm - An Exploration of Hardware and Software
The Easiest Path to using AR/VR/MR: The Smartphone
Arguably, the easiest way to get started with AR/VR/MR is to use a smartphone. A wide variety of phones and apps on both platforms are available to use with your students right now. Today, I'm going to be using a 2020 Apple iPhone SE (the budget model) alongside a Google Pixel 3 (the Pixel 5 is the latest model).
As a word of caution, not all phones work equally well, and it should be noted that newer phones with multiple cameras will lead the way with the ability to scan real-world objects for use with AR/VR/MR. Click the image to see what it looks like in real life.

The new iPhone 12 Pro, iPhone 12 Pro Max, and iPad Pro have built-in LiDAR scanners for AR.

Let's Start by Looking at a Few Augmented Reality Apps First!
ARKit4 (Apple)
ARKit 4 introduces a brand-new Depth API, creating a new way to access the detailed depth information gathered by the LiDAR Scanner on iPhone 12 Pro, iPhone 12 Pro Max, and iPad Pro. Location Anchoring leverages the higher-resolution data in Apple Maps to place AR experiences at a specific point in the world in your iPhone and iPad apps.* And face tracking is now supported on all devices with the Apple Neural Engine and a front-facing camera, so even more users can experience the joy of AR in photos and videos.
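To give a sense of what opting into the Depth API looks like, here is a minimal Swift sketch. It only runs on a LiDAR-equipped device, and the session setup shown is illustrative rather than a complete app:

```swift
import ARKit

// A minimal sketch: opting into the ARKit 4 Depth API.
// Requires a LiDAR-equipped device (iPhone 12 Pro / Pro Max, iPad Pro).
let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
    // Ask ARKit to attach per-pixel depth data to each frame.
    configuration.frameSemantics.insert(.sceneDepth)
}

let session = ARSession()
session.run(configuration)
// In an ARSessionDelegate, each ARFrame then exposes
// frame.sceneDepth?.depthMap — a CVPixelBuffer of depth values in meters.
```

Guarding with `supportsFrameSemantics(_:)` lets the same app run on non-LiDAR devices, just without depth data.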
AR Quick Look
Built-in apps, such as Safari, Messages, Mail, News, and Notes, use Quick Look to display USDZ files of virtual objects in 3D or AR on iPhone and iPad. You can embed Quick Look views in your apps and websites to let users see incredibly detailed object renderings in a real-world surrounding with support for audio playback.
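If you want the same viewer inside your own app, a short Swift sketch might look like the following. The file name `toy.usdz` is an assumed example, not a real asset:

```swift
import UIKit
import QuickLook
import ARKit

// A minimal sketch: presenting a bundled USDZ model with AR Quick Look,
// the same viewer Safari, Messages, and Mail use.
class ModelPreviewController: UIViewController, QLPreviewControllerDataSource {

    func presentModel() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // "toy.usdz" is an illustrative file name for a model in the app bundle.
        let url = Bundle.main.url(forResource: "toy", withExtension: "usdz")!
        return ARQuickLookPreviewItem(fileAt: url)
    }
}
```

Wrapping the URL in `ARQuickLookPreviewItem` (rather than returning the URL directly) gives you the AR-enabled viewer with object placement and audio playback.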
3D Models
Tap any of the 3D models here on an iPhone or iPad to view the object and place it in AR.* Or click a model on a Mac to download the USDZ file.
ARCore (Android)
ARCore is Google’s platform for building augmented reality experiences. Using different APIs, ARCore enables your phone to sense its environment, understand the world and interact with information. Some of the APIs are available across Android and iOS to enable shared AR experiences.
ARCore uses three key capabilities to integrate virtual content with the real world as seen through your phone's camera:
Motion tracking allows the phone to understand and track its position relative to the world.
Environmental understanding allows the phone to detect the size and location of all types of surfaces: horizontal, vertical, and angled surfaces like the ground, a coffee table, or walls.
Light estimation allows the phone to estimate the environment's current lighting conditions.
ARCore is designed to work on a wide variety of qualified Android phones running Android 7.0 (Nougat) and later. A full list of all supported devices is available here.
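Since ARCore targets Android, a sketch of checking device support and turning on these capabilities is most natural in Kotlin. This is an illustrative outline, not a complete activity:

```kotlin
import android.app.Activity
import com.google.ar.core.ArCoreApk
import com.google.ar.core.Config
import com.google.ar.core.Session

// A minimal sketch: check whether the device is ARCore-qualified,
// then configure a session with light estimation and plane detection.
fun startArSession(activity: Activity): Session? {
    val availability = ArCoreApk.getInstance().checkAvailability(activity)
    if (!availability.isSupported) return null  // not a qualified device

    val session = Session(activity)
    val config = Config(session).apply {
        // Estimate the environment's current lighting each frame.
        lightEstimationMode = Config.LightEstimationMode.ENVIRONMENTAL_HDR
        // Detect horizontal and vertical surfaces (floors, tables, walls).
        planeFindingMode = Config.PlaneFindingMode.HORIZONTAL_AND_VERTICAL
    }
    session.configure(config)
    return session
}
```

Note that `checkAvailability` can also report a transient "checking" state on first call, so production code typically re-queries before giving up on a device.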
