Apple Brought AR to Life

By Dan Sung, METRO


APPLE’S most significant launch of 2017 was not the iPhone X. Nor was it the Watch Series 3. In fact, it wasn’t a device at all but the iOS 11 software — specifically the framework inside it called ARKit, which will change the world as we know it.

ARKit is a set of tools for iPhone app developers, the people who turn pretty gadgets into genuinely useful items. ARKit makes it simpler for them to dream up augmented reality (AR) apps — those which superimpose computer-generated images on top of our view of the real world. It also makes AR work a whole lot better. Chances are you’ve already used augmented reality, whether you know it or not, through Pokémon Go or Snapchat and its animated selfie additions and filters, not to mention the dancing hotdog, which was viewed 1.5 billion times over the summer.


Futuristic dream: Developers at Microsoft HoloLens are pointing the way, while Magic Leap has raised a staggering $1.4bn of backing for its projects

See Metro’s top ARKit apps (below) on the App Store for a taster of how Apple is set to make AR more fun and useful than ever before.

Augmented reality is nothing new but, much like its cousin virtual reality, it’s been waiting for someone to realise its potential. The success of Pokémon Go and its 650 million downloads was a marker of the appetite for AR, and this was nothing sophisticated. With ARKit, your phone can recognise the surfaces of your surroundings and put the virtual objects right on top of your desk or your car, instead of just leaving them floating aimlessly in the air. The iPhone’s dual cameras can sense depth far better, track your every movement and adjust the computer-generated overlays to match in a realistic way thanks to the high-performance A11 processor at the heart of the latest Apple handsets.
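The surface-recognition behaviour described above can be sketched in miniature: once a framework like ARKit has detected a horizontal plane, placing a virtual object on your desk reduces to intersecting a ray from the camera with that plane. A toy illustration in Python (not Apple's actual API; the function and values are ours):

```python
def hit_test(ray_origin, ray_dir, plane_y):
    """Intersect a ray with a horizontal plane at height plane_y.

    Returns the 3D hit point, or None if the ray is parallel to the
    plane or the plane is behind the camera.
    """
    ox, oy, oz = ray_origin
    dx, dy, dz = ray_dir
    if abs(dy) < 1e-9:           # ray parallel to the plane: no hit
        return None
    t = (plane_y - oy) / dy      # parameter along the ray at the plane
    if t < 0:                    # plane is behind the camera
        return None
    return (ox + t * dx, plane_y, oz + t * dz)

# Camera 1.5 m above a detected desk surface, looking forward and down:
print(hit_test((0.0, 1.5, 0.0), (0.0, -1.0, -1.0), 0.0))
# -> (0.0, 0.0, -1.5): the object lands on the desk, 1.5 m ahead
```

In a real app the ray would come from a screen tap unprojected through the camera pose, and the plane from the framework's detected anchors; the geometry, however, is exactly this.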

More important than the tech itself are the millions of iPhones and iPads already out there with potential AR users just waiting to get going. That’s a very different situation from the last big push for AR, which came courtesy of the Google Glass Project and its prohibitively expensive smart eyewear, its limited applications, its ‘glassholes’ label and the host of bad press around privacy.

‘The fixation that AR is dependent on consumer-grade AR glasses as the essential element for widespread usage is now over,’ says multimedia expert Ken Blakeslee of WebMobility Ventures.


‘Pokémon Go taught millions of consumers the value of the smartphone screen as a viable experience for AR. Apple has just put in place the platforms for rampant AR development and, with the capabilities of the iPhones 8 and X, everything is in place to finally exploit AR in a way consumers will find comfortable and valuable.’

Since Glass Explorer ended in 2015, Google has learned much about AR and now has a set of software tools for Android called ARCore. Expect an AR app explosion on Google-based smartphones.

But the futuristic dream of smart glasses isn’t dead, it’s just that our way of getting there has changed. According to Blakeslee, the focus on smartphone AR is going to accelerate the investment in AR eyewear that looks good and works perfectly. There are already rumours that Apple is in talks with optics firm Carl Zeiss to make Apple glasses.

Microsoft is trying out a more mixed-reality approach with its oversized, more VR-looking HoloLens headgear. The idea is to turn the space around you into a 3D version of your computer desktop. It takes your pictures, your graphs and your windows off your monitor and throws them out into the real world.

Graphic models of the solar system can float around your table, PowerPoint presentations can rise up from your desk and it’s a lot easier to achieve because none of the computer-generated wizardry is reacting to the world around it. Samsung, Acer, Lenovo and Dell all have HoloLens headgear in the pipeline.

While Apple and Google will create the appetite, Microsoft HoloLens can begin to build the future along with companies like the mysterious Magic Leap, according to co-president of the VR/AR Association, Steve Dann.

‘HoloLens is pointing the way’, he says. ‘Microsoft has been very brave to bring it out early and Magic Leap has had more money pumped into it than any new company in tech ever. A lot of people are betting that it not only works but that it’s going to be relevant for an awful lot of people. We’ve all got our fingers crossed.’

Vision: HoloLens turns space around you into a 3D version of your computer desktop

Magic Leap is the sweet spot in the middle, the name that’s been teasing the idea of super-smooth AR through its next-generation headset since 2016. Video demos of little cartoon characters hiding under objects have been enough to earn the company $1.4bn of backing. Word is that the device is bigger than a normal pair of glasses but smaller than VR goggles and that it projects a ‘light field’ of virtual images directly on to your retinas alongside everything else you normally perceive. Could it be the first company to crack smart glasses?


‘I’m pretty confident we’ll be wearing smart glasses within the next two or three years, looking slightly oversized but nothing that would be out of place on the street,’ says Dann. ‘After five or ten years they’ll be indistinguishable from normal glasses and people will have forgotten what life was like without augmented reality everywhere we go.’

With so many players in the game, it’s unlikely any company is going to own the space but Apple chief Tim Cook might just be able to look back and say that it was his firm that made AR finally click.

iPhone AR apps to try

IKEA Place (free)

Try before you buy and drop a true-to-scale piece of IKEA furniture into your living room before going to a showroom.

Hair Color by Modiface (£1.99)

Find out what you’d look like as a blonde or redhead. Front, side, back; it tints and tracks the lot.

Fitness AR (£2.99)

Show off your bike ride to your friends. Link it up to Strava to create a 3D tabletop map of your Herculean trail.

The Machines (£4.99)

Apple’s showcase AR gaming fun – a robot battlefield on your very own desk. Your orders, general?

Source

ARCore + ARKit = Half a Billion Devices by Year-End (new report)

Subscribe to ARtillry Insights here.


There's been lots of talk about mobile AR's opportunity. And the best way to quantify that is through the installed base of AR-compatible devices in the market. So ARtillry set out to do just that in its latest report, accessible through VRARA's ARtillry Insights subscription.

The verdict: There will be 505 million AR-compatible smartphones by the end of 2017 and 4.2 billion by 2020. That may seem like steep growth, but it is a function of the hardware replacement cycles for iOS and Android (roughly 2.5 years), which will turn over the installed base rapidly (methodology below).

One question is which AR platform is better positioned for growth and market share. It won’t be a winner-take-all market, just as iOS and Android have coexisted for years. And there is evidence that they’ll have some compatibility, or at least portability of graphical assets.

But they’ll still compete on many levels, and there are signals that indicate competitive differentiation on both sides. Google has greater scale and a technical lead from years invested in Tango. But Apple has more control over the hardware in its classic vertical integration.

In terms of reach, Apple has the short-run advantage, based on a more unified hardware and software set that supports wider compatibility with ARKit. But Android will have the longer-term scale as compatibility cycles into the much larger Android universe.

ARKit and ARCore also carry their parents’ DNA. For Apple, it’s all about apps. For Google, the web. For developers making a platform choice, that means ARCore could reach more users, but ARKit could be more monetizable through app revenue models (downloads, in-app purchases).

There's a lot more to it of course, and you can get a video summary below, and the full report here. Stay tuned for ongoing coverage as ARCore and ARKit continue to evolve and get deployed by mobile AR developers. It will be an exciting time.

Methodology: ARtillry's forecasting involves a unit-penetration model based on cumulative smartphone sales going back 10 quarters (the average replacement cycle), while also factoring in AR compatibility (for example, A9 chips or greater for ARKit, and Android 7.0 for ARCore).
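As a rough sketch of that unit-penetration approach, with made-up illustrative figures rather than the report's data: sum device sales over the trailing replacement window, then apply the share of those units that meet the AR compatibility requirements.

```python
def installed_base(quarterly_sales, compat_share, window=10):
    """Estimate the AR-compatible installed base, in millions of units.

    quarterly_sales: sales per quarter, oldest first, in millions
    compat_share:    fraction of recent sales that are AR-compatible
    window:          replacement cycle in quarters (2.5 years = 10)
    """
    recent = quarterly_sales[-window:]   # devices presumed still in use
    return sum(recent) * compat_share

# Hypothetical platform selling 60M units/quarter, half AR-compatible:
sales = [60.0] * 12
print(installed_base(sales, 0.5))   # -> 300.0 (million units)
```

The real model would use per-quarter sales data and per-device compatibility rules, but the structure is the same: a rolling sum filtered by compatibility.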

Bay Area: Join Us at The Confluence Summit (10/27)

VRARA Members, contact us for a discount code


We're living in a period of technological convergence. And those converging factors notably include immersive technologies like VR and AR. 

With that backdrop, Winston Baker will host the Confluence Summit, happening October 27, in Menlo Park, CA and featuring speakers like YouTube founder Steve Chen. 

VR/AR Association, SF has partnered with the conference, meaning we have discount codes for members to save 30% on registration (contact us).

The conference program is here and you can register here. We hope to see you at the show. 

More from the conference organizers

Since the earliest days of cinema and television, creative storytelling and technology have worked hand in hand to entertain audiences in new and exciting ways. One of the world’s greatest innovators of all time, Thomas Edison invented many devices that greatly influenced the world beyond the light bulb: for instance, the phonograph and the motion picture camera.

In more recent years, the era of VOD and digital distribution has profoundly affected the way in which the content business works. And now, with the proliferation of immersive experiences, both the technology and entertainment sectors have a new set of opportunities and challenges to face.

What will the future hold for storytelling on new mediums? How will content be made, accessed and distributed? What devices will consumers adapt to?

As the forces of technology and entertainment continue to both disrupt and influence one another in this ever-changing media landscape, how can tech developers and content creators combine efforts to take our generation’s innovative revolution even further? We hope to discover the answers at The Confluence Summit.


Learn more about VR/AR SF chapter here

Recap of VRARA NYC Chapter Event ‘Narrative in VR: How to Create Compelling Stories with Virtual Reality’


On a steamy late-September Monday (September 25, 2017), on the first day of Advertising Week New York, the VR/AR Association NYC Chapter hosted a panel with some of the leaders in VR storytelling, ‘Narrative in VR: How to Create Compelling Stories with Virtual Reality,’ at the NYU Tandon Future Lab.

Moderated by Chris Pfaff, the panel featured Brian Seth Hurst, Chief Storytelling Officer and President at StoryTech Immersive; Raheel Khalid, CTO of Verizon envrmnt; Caitlin Burns, founder/CEO of Caitlin Burns & Associates; and Lewis Smithingham, president and partner at 30ninjas. An audience of 35 producers, artists, and students took part in a lively discussion of VR’s narrative structures, and of how much of today’s VR industry has adapted game design techniques to better deliver moving experiences.

Brian Seth Hurst showed his groundbreaking piece ‘My Brother’s Keeper,’ which premiered on HTC Viveport at Sundance 2017 and was produced for PBS Digital Studios. The most widely distributed VR film to date, ‘My Brother’s Keeper’ was what Hurst described as an invention process: it is the first live-action VR film shot at 120 frames per second, and includes innovations such as 180 framing and bokeh inside the sphere.

Raheel Khalid showed some of Verizon envrmnt’s latest work, its ‘Virtual Sports Bar’ experience, which creates opportunities for multi-user drop-in experiences. He has helped build new tools for multi-user VR experiences that will enable producers and consumers to shape their own VR narratives in real-time.

Caitlin Burns described some of her work on Space Nation, a Helsinki-based organization that uses virtual experiences to train civilians for space travel. She also discussed some of her early learnings in VR storytelling, and how to overcome technological hurdles to maintain narrative focus.

Lewis Smithingham described challenges that he has faced with VR and AR productions, including his work for the ‘Conan O’Brien Show.’

Medical Education in 360 Video and Virtual Reality

By Brandon Birckhead MD (Co-Chair), with Betsy Eble, Rafael Grossmann and Adriana Albritton, participants in the VRARA Digital Health Committee


You may have heard the phrase “running a code” in the media; it refers to the protocol used when a patient’s heart stops pumping. The procedure, called Advanced Cardiac Life Support (ACLS), is taught in every medical school, usually first in a simulation center with robotic patients and several team members. Each person has a different duty during the code: performing CPR, obtaining vitals, managing the airway, drawing up and giving medications, documenting the event, and taking the lead in the situation. When the real thing happens, these situations are intense and the room is packed with people.

How 360-Degree Video Could Be Useful for Medical Education

Healthcare providers could replay their own simulated codes with a system that helps guide them to the correct course of action during the replay of the video. One could add arrows to point to the areas that need attention first, e.g. the airway. The program could pause at critical moments to let the provider choose among the available options. A program could also test the student at the end of the replay, using a “center of focus” heat map to find which steps they missed during the review. Customized video may not be needed for each healthcare staff member in training; a single recording may provide a significant benefit on its own, though it would be interesting to see an experiment comparing the two.
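The “center of focus” heat map idea can be illustrated with a minimal sketch, assuming (hypothetically) that the headset reports the viewer's gaze direction as yaw/pitch samples: bin those samples into an equirectangular grid, and cells the viewer never looked at reveal missed steps.

```python
def gaze_heatmap(samples, yaw_bins=8, pitch_bins=4):
    """Count gaze samples per (yaw, pitch) cell of a 360 view.

    samples: (yaw, pitch) pairs in degrees,
             yaw in [0, 360), pitch in [-90, 90)
    """
    grid = [[0] * yaw_bins for _ in range(pitch_bins)]
    for yaw, pitch in samples:
        col = int(yaw % 360 // (360 / yaw_bins))      # 45-degree columns
        row = int((pitch + 90) // (180 / pitch_bins))  # 45-degree rows
        grid[row][col] += 1
    return grid

# Three samples near the airway (straight ahead), one glance elsewhere:
heat = gaze_heatmap([(10, 0), (15, 5), (200, -30), (12, 2)])
print(heat[2][0])   # -> 3: most attention in the yaw 0-45, pitch 0-45 cell
```

A real system would weight samples by dwell time and compare the map against the regions each protocol step requires, but the binning above is the core of the analysis.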

One way to study this process is to place the 360-degree camera where the patient’s head would be. Seeing things from the patient’s perspective can be quite humbling: you see how your body language is perceived from their viewpoint. Research has shown that a 360 video shot from that point of view can invoke a feeling of empathy. It is also one of the best camera positions, as the patient is usually in the middle of the room, allowing full use of the 360-degree view.

Another idea is to place two or more inexpensive 360 video cameras in the room, allowing the student to change perspective during playback. This could be similar to the 360-video experience created by Intel (True VR) for MLB. Benefits would include the empathy gained from the patient placement; a third-person view from an overhead camera showing all movement patterns and the placement of equipment and staff during the code; and any additional placements that might benefit the student being trained. Adding bio-feedback sensors to the simulations may help identify the procedures, duties and situations of higher stress for participants, such as breaking bad news or assessing a dangerous situation. Integrating multiple scenarios or scenario branches into a learning management system and a single simulation would allow students to interact with the learning module: the students could be given a list of actions, each of which would start a new simulation.

Another interesting idea is the use of eye-tracking technology. I’ve been working with VPS, a company that makes a simple-looking pair of glasses that track the user’s focal point. Integrated with the third-person view, the student would be able to see themselves on the 360 camera. With this technology, analyzing where the student was looking during the task has immense learning potential.

360 Video Success

One of the most successful startups in the VR industry, StriVR, uses 360-degree technology to train people to perform better at a task. It has had NFL teams, college teams, and major corporations as clients. Quarterbacks are able to go over plays many more times than they could on the field. Now the company is providing this technology to train Walmart employees.

Athletes become better by practicing. And being able to repeat movements and visualize situations without having to deal with a harsh environment or contact from an opposing team provides a competitive advantage. In the same manner, a person becomes more proficient and more able to master a skill by becoming familiar with essential tools, viewing different perspectives, and playing diverse roles in a scenario. Practicing and role playing also diminish the shock and stress reaction that takes place when first responding to a crisis situation. Interactive experiences provided by 360 video and immersive VR can make learning much more comprehensive and deeply ingrained in memory.

Interactive VR Experience

There is a company making an interactive VR experience for medical education, BioflightVR. The company has an emergency pediatric program similar to the scenario described above. It would be interesting to see the outcomes of both a 360-degree program and an interactive VR program. I imagine there is more information gained by assessment, and possibly more retention, with the interactive VR program; however, a study could find them to be comparable. The advantage 360 video has over interactive VR is the time needed to create the experience. If every hospital wanted specific simulations for each student, it could not be done with the current methods of creating an interactive VR experience. However, 360-degree video equipment has advanced substantially, and Medical Simulation Centers have several staff members who can handle video equipment.

There are several questions that could be examined in a research study to evaluate the use of this technology:

1) How does a 360-video feedback program compare to traditional video feedback for teaching medical students with regard to retention, preference, and assessment?

2) Does the placement of the 360 camera affect the outcomes when using 360-degree video for feedback (i.e. patient perspective vs. center of room vs. physician lead position)?

a. Secondary outcome: Is there an increase in empathy for the patient when the camera is placed at the patient's perspective?

3) If medical students or residents take home 360-degree video for repeated playback over a set period of time, does it have an impact on long-term memory of the protocol (intervention during 2nd year of medical school with assessment in 4th year)?

4) Are there improved outcomes (retention of knowledge, preference by student, and assessment by teacher) in using recorded simulations for each student vs. a single simulated recording?

We are currently working with the other VRARA Digital Health Committee members to improve on this idea.


ARCore & ARKit: The Acceleration of Mobile AR (new report)

Subscribe to ARtillry Insights here.


Mobile AR hit an inflection point in 2017. It started with Facebook's Camera Effects Platform in April, followed by Google's "visual search" and "VPS." Then in June we saw Apple's ARKit, followed by Google's August ARCore launch.

Those last two are perhaps most impactful because of the tools they create for developers to build more advanced AR. Mobile AR can really scale when it's in the hands of developers, and when they're incentivized by an installed base in the hundreds of millions of devices.

We've recently quantified that installed base (ARCore & ARKit), but decided to take it a step further. ARtillry's latest Intelligence Briefing takes a deeper dive on ARCore and ARKit. How do they differ? What happens next? And what does it mean for AR developers?

The report tackles these questions, which you can preview and/or subscribe here. More can be seen below in the report's executive summary. Stay tuned for more analysis in the coming weeks, especially as mobile AR continues to evolve.

Executive Summary

Over the past six months, the tech sector has reined in its initial excitement about glasses-based augmented reality (AR). This includes realigned expectations on the time horizon to consumer ubiquity. But in the meantime, the AR world is keeping busy with another opportunity: mobile AR.

Beyond specs (battery life, field of view, etc.), AR glasses’ biggest detriment is form factor: they need to be sleek and cheap enough to sway consumers past a key point of friction, personal style. The bar is set high for anything people are asked to put on their face, as Google Glass taught us.

This concern goes away in enterprise contexts (the topic of another report) but is a sizeable barrier in consumer markets. And we’re a few years from marketable formats. The good news is that the stepping stone — or gateway drug as we like to call it — is mobile AR. And there’s a lot happening.

Going by the numbers, mobile AR’s addressable market isn’t the low-millions of headsets: it’s the 3.2 billion global smartphones today and 4.6 billion by 2020. Those aren’t all AR compatible in terms of optical and processing components, but most will be over the next replacement cycle (2.5 years).

Google’s AR development kit ARCore will become compatible with 3.9 billion global Android devices during this time frame, and Apple’s ARKit will reach 673 million iPhones. Both achieve AR through software, utilizing the standard smartphone RGB camera, thus lowering the barrier to “true AR.”

Compared to graphics that simply overlay a scene, true AR infuses graphics that interact with physical objects in dimensionally accurate ways. ARCore and ARKit apply simultaneous localization and mapping (SLAM) through a surface detection approach that doesn’t require advanced optics.
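As a toy illustration of the surface-detection idea (not ARKit's or ARCore's actual SLAM algorithm), here is a crude voting scheme that finds the dominant horizontal plane among sparse 3D feature points of the kind such systems track:

```python
def detect_horizontal_plane(points, tolerance=0.02):
    """Return the plane height supported by the most points, with its inliers.

    points:    (x, y, z) feature points in metres, y being height
    tolerance: how far a point may sit from the plane and still count
    """
    best_y, best_inliers = None, []
    for _, candidate_y, _ in points:          # each point proposes a plane
        inliers = [p for p in points if abs(p[1] - candidate_y) <= tolerance]
        if len(inliers) > len(best_inliers):
            best_y, best_inliers = candidate_y, inliers
    return best_y, best_inliers

# Four points on a desk at ~0.72 m plus one stray point near the floor:
pts = [(0.1, 0.72, 0.3), (0.2, 0.73, 0.1), (0.4, 0.72, 0.2),
       (0.3, 0.71, 0.4), (0.5, 0.00, 0.9)]
y, inliers = detect_horizontal_plane(pts)
print(y, len(inliers))   # -> 0.72 4: the desk surface wins the vote
```

Production systems fit arbitrarily oriented planes with RANSAC and refine them over time as tracking accumulates points, but the consensus-voting principle is the same.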

The result is an overall democratization of advanced AR capability. This starts with the massive installed base mentioned above, which in turn incentivizes developers with a larger addressable market. Then the content they create entices more users to engage, enacting a virtuous cycle.

Looking forward, we can expect a wave of AR apps as ARCore and ARKit gain footing. But more impactful will be years of third-party innovation with both SDKs. That could rival, in creativity and advancement, the app economy itself, which kicked off ten years ago with the first iOS SDK.

But several questions remain: How quickly will this happen? What are the pros and cons of each AR toolkit? What will be best practices in building, distributing and marketing AR apps? And what does it all mean from where you sit? These questions are tackled throughout this report.

USA Today Network’s VR Ad Studio has Shown There’s an Audience for VR Ads

USA TODAY NETWORK is a member of the VR/AR Association. 

Join our Advertising Committee here.

The in-house studio has been working with brands to develop VR and 360 branded content and with Nielsen to measure the impact of these new ad experiences.

USA Today Network’s in-house studio, GET Creative, developed 360-degree/virtual reality branded content for Pure Michigan promoting travel and tourism to the state.

For the USA Today Network, virtual reality and 360-degree content is more than just hype. It’s becoming a revenue driver.

For roughly a year and a half, the Network has been working with advertisers to develop immersive branded content that can be distributed across its media properties.

The effort is spearheaded by the GET Creative unit, which launched in March 2016 as an in-house agency charged with executing projects for advertisers that can be promoted in all media that the USA Today Network operates in — including virtual reality (VR). The Network consists of over 90 local media properties and the flagship USA Today, reaching a combined audience of more than 100 million consumers in the US, according to the company.

GET Creative’s first project was for Honda. The team used VR to promote the carmaker’s ultrafast two-seat Indy race car. The branded VR experience put consumers in the passenger seat of the Indy car, rounding a race track at 200 miles per hour.  That project “showed there’s an audience for VR,” said Kelly Andresen, SVP and head of GET Creative at USA Today Network, in a phone interview.

Most recently, the team launched a 360-degree campaign for Pure Michigan to promote travel and tourism to the state. Viewers can “look around” at the various points of interest profiled in the video.

Experimenting with turnkey VR-specific ad formats & measurement

In addition to branded content opportunities, the Network is experimenting with ad formats designed specifically for VR environments. The first is what the company calls a “cubemercial,” which puts the users inside a room or cube in which advertisers can project videos and other creative assets on all four of the “walls.” The aim is to make this entirely new format turnkey for advertisers by incorporating brands’ existing creative assets.

The Network has partnered with Nielsen to measure the impact of VR on brand metrics. “It’s an amazing medium for advertising,” said Andresen, “likely because it’s so immersive people remember the content and VR has a 2x brand recall compared to TV.”

The studio takes a multiformat approach: creating true VR content that requires a headset like Google Cardboard, Oculus or Samsung Gear VR and 360-degree content that can be distributed across mobile, desktop and app and does not require a headset.

“The multiformat approach expands reach, and we see 360 as a gateway to true VR,” says Andresen.

What’s holding VR advertising back?

Interestingly, scale isn’t what Andresen mentions when asked what needs to happen for VR ads to become mainstream. “We have seen growth in true VR reach and expect to see more with the lower price points and variety of headsets available,” she said.

Instead, she named two critical things that still need to happen for VR advertising to truly take off:

We need new words.

First and foremost, says Andresen, is the need to establish a common lexicon for VR. There is no way to describe a ‘shot list’ and story line to a client, for example, and the point of view for the story line now depends on the user’s frame of reference. The entire industry — producers, storytellers, clients, agencies — needs to be able to communicate.

We need a standard that can scale.

Second is the need for standards for VR ads and one experience that can scale. “All of us are challenged to really think creatively here. I wouldn’t want us to default to things like pre-roll. That’s not a great experience and we know this. An intrusive ad experience in VR is particularly bad because users have nowhere to go. . . Product placement is an interesting approach, but there is a challenge for scale because it’s so specific to the context,” says Andresen. “Our first foray borrowed heavily from linear video, but we see more [opportunity]. Can we move to a standard that’s scalable? Branded content has been a solution that has filled a void.”

Google introduced an early VR ad concept this summer, and startups like Immersv and Outlyer Technologies are early entrants working on VR advertising.

There is no lack of client interest, says Andresen. Initially, clients seek education on the size of the opportunity and the capabilities. Budgets are different for every client: some pull from their video budget, some have an innovation budget, and some — she points to Lexus’s work with Saatchi & Saatchi’s Team One agency — have already made the investment in VR and are just looking for scale.

For its part, Andresen says, GET Creative is in a position to grow adoption of this new medium that could end up changing our perceptions of advertising.

Original article

What's the Role of Interactivity in VR Game Development? 

In anticipation of the upcoming VRX Conference in San Francisco (Dec 7-8), VR Intelligence and the UK Dept of Trade & Investment have collaborated on a new report.

Entitled The Role of Interactivity in VR Game Development, the free report provides an in-depth look at the challenges and opportunities of interactivity and creating presence in VR game development and virtual worlds.

It features contributions from:

  • Tam Armstrong, Studio Director, Polyarc
  • Pablo Fraile, Director of Ecosystems, ARM
  • Rob Whitehead, CTO, Improbable
  • Jennifer Chavarria, Head of Studios, Kite & Lightning
  • Simon Harris, Executive Producer, Supermassive Games

The report also explores:

  • Why interactivity is key to augmenting presence in VR – and how to do it best
  • What can be done to drive increases in character and object interaction, and how to take advantage of new technology advances like eye and motion tracking
  • Perspectives on how important audio is in enhancing interactivity and presence in VR

The free report can be downloaded here, and VR/AR Association members can contact us for a 15 percent discount to the VRX Conference.

We hope to see you there.

Recap of VRARA Expert Panel at Ad Week New York: What do Agencies Need to Know about VR #AWNewYork


On September 25, the VR/AR Association NYC Chapter hosted a panel during Advertising Week New York of leading advertisers working in VR/AR to spark discussion about virtual and augmented reality and possible use cases in branded work. Moderated by Friends With Holograms founder Cortney Harding (Co-Chair of the VRARA Advertising Committee), the panel featured Rori DuBoff of Accenture, Robert Lester of Glow, Doug Barr of Havas, and Christine Lane of McCann.


To kick the panel off, everyone was asked about the worst VR/AR experience they’d ever seen, and what lessons could be learned from the mistakes those creators made. Lester pointed to a narrative piece that missed the mark with scale, where the user constantly changed perspectives and sizes, while Lane pointed to a more common example of a nature piece that succumbed to subpar CG. DuBoff also referenced a piece that got scale wrong, and Barr pointed out common problems with locomotion, especially pieces set on roller coasters, which can sometimes make viewers sick.

While all of the problems present in these pieces are fairly easy to solve, they highlight another dilemma -- how to make sure agencies and brands don’t try one piece, have it fail, and decide that VR and AR are not worth exploring further. Barr referenced a situation where he helped a competitor fix a piece; while his client wasn’t happy about it, he believes that a rising tide will lift all boats, and the more good work there is on the market, the better. Lester pointed to the need for internal evangelists at agencies and brands, and DuBoff and Lane both said that advocates need to share both good and bad use cases in order for clients to learn and focus.

In terms of moving forward, all the panelists had specific new developments they were excited about, ranging from standalone headsets that are rumored to be coming soon, to multiplayer experiences, to WebVR and social VR. In terms of spreading the technology, Barr pointed out that VR has a faster adoption rate than several other well-known devices, while others cautioned patience and pointed to the need to use it in training applications to get it to users.

Augmented reality is still fairly new for the most part, as ARKit only recently launched, but all the panelists were bullish on how it would be adopted by brands. Use cases ranged from practical (tape measures, furniture placement) to informational (the MLB app that allows users to view more stats) to the ridiculous (an app that allows users to paint on the world).

The conversation then shifted to the metrics brands care about when evaluating VR and AR pieces, and the need to create stickier content that users will return to again and again. Panelists pointed out that heatmaps are a great way for brands to measure how long viewers looked at something and to tweak campaigns in response to that data. Finally, the panelists were generally behind giving away cheaper headsets as a gateway to VR, even though the quality might not be ideal.

The event was the first in what will hopefully be a series of conversations presented by the VRARA and Friends With Holograms to inform brands about the possibilities and best practices for VR and AR. 

Advertisers ranked VR/AR as their number-two focus among digital marketing technologies for 2017.


We invite everyone to join the discussion as part of our Advertising Committee; join here.

ARCore Will Reach 3.6 Billion Phones by 2020

For a deeper dive on VR/AR trends & data, subscribe to ARtillry Insights.

ARCore Installed Base.png

Just like Apple's June ARKit launch, Google's recent ARCore unveiling has bred lots of interest in its addressable market. And just as with ARKit, ARtillry has applied best practices in market sizing and forecasting to pinpoint that figure.

The verdict: There are 26.5 million ARCore-compatible phones today, growing to 71.5 million by the end of 2017. Based on the size of the Android universe, this will quickly accelerate over the next few years, reaching 3.6 billion units (92 percent Android coverage) by 2020.

How did they arrive at these figures? The starting point is ARCore's current compatibility, limited to the Google Pixel and Samsung Galaxy S8 running Android 7.0 (Nougat) or greater. Looking at cumulative sales figures for both devices, we're at roughly 26.5 million total units in market.

But that’s the easy part. The hard part is projecting forward. Based largely on the size of the overall Android installed base — 2.9 billion global devices today, growing to 3.8 billion by 2020 — number crunching ensued. One key forecast input is upgrade cycles in the Android Universe.

About 16 percent of Android devices run OS versions released within the past year, while 32 percent run versions between one and two years old, 29 percent between two and three, 15 percent between three and four, and 8 percent older than four years (hence ARCore's lack of full coverage in 2020).
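As an illustration, the coverage arithmetic above can be sketched in a few lines of Python. This is a back-of-envelope reconstruction using only the figures cited here (the OS-age shares and the 3.8 billion projected base), not ARtillry's actual model:

```python
# Shares of the Android installed base by OS-version age,
# per the distribution cited above.
os_age_shares = {
    "under 1 year": 0.16,
    "1-2 years": 0.32,
    "2-3 years": 0.29,
    "3-4 years": 0.15,
    "over 4 years": 0.08,
}
assert abs(sum(os_age_shares.values()) - 1.0) < 1e-9

# If by 2020 ARCore reaches every device running an OS no more than
# ~4 years old, coverage is everything except the oldest bucket.
coverage = 1.0 - os_age_shares["over 4 years"]  # 0.92

android_base_2020 = 3.8e9  # projected global Android devices in 2020
compatible_2020 = coverage * android_base_2020

print(f"{coverage:.0%} of the base -> {compatible_2020 / 1e9:.1f}B units")
```

This crude cut lands at roughly 3.5 billion units; the report's fuller model, which layers in per-year growth and upgrade dynamics, arrives at the ~3.6 billion figure quoted above.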

Android & iOS installed base.png

There was also a major hint from Google: The company noted in ARCore's introduction that it's working towards a goal of 100 million compatible phones — including Huawei, Asus and LG devices — when the platform launches. It didn't provide a date, but we're predicting Q1 2018.

Stepping back, one takeaway is that ARKit has a slight advantage in being first to market, with a head start in developers' invested time. But over AR's lifespan, Apple's three-month head start will matter less and less. Greater developer attraction will ultimately come from platform reach.

Apple also has a near-term lead in the installed base of ARKit-compatible iPhones (380M), but one hardware replacement cycle (2.5 years) will give most smartphones AR-compatible optics and processing. And the Android universe exceeds iOS by more than two billion units.

Other points of differentiation come down to each platform's approach to delivering AR. Apple's DNA is an app-based framework, while Google's web-based DNA will be reflected in its use of Web AR. The latter could have less friction (in addition to more scale), as we've examined.

There is of course a lot more to these competitive dynamics and addressable market sizes, and they'll be included in ARtillry Insights' latest report, available tomorrow. It will take a deeper dive on ARCore and ARKit, and the strategic implications for everyone. Subscribe here.


For a deeper dive on AR & VR insights, subscribe to ARtillry Insights. See ARtillry's market-sizing and forecasting credentials here. 

Virtual Reality and its Impact on the Field of Criminal Justice

VRARA Criminal Justice Committee Seminar Pictured Left to Right: Eric Dustin of FARO, Rory Wells, Esq. of Ocean County Prosecutor’s Office, Ed Williams of FARO, Eduardo Neeter of FactualVR, Greg Schofield of Toronto Police Service

Jersey City, New Jersey – Multiple law enforcement agencies, academics, start-ups, non-profits and corporations from the United States and Canada met today in Jersey City for a first-of-its-kind seminar and discussion on the impact of Virtual Reality and Augmented Reality on the Criminal Justice System.

The Virtual Reality/Augmented Reality Association’s (VRARA) Criminal Justice Committee held its first event today on current and future applications of virtual reality technology, where an individual or group of individuals are immersed in a 3D experience using headsets or glasses.

The meeting covered demonstrations of the latest technology, including laser scanners and VR applications from event co-sponsors FARO Technologies and FactualVR, whose in-development technology allows investigators to accurately replicate and communicate the facts around crime scenes to aid in investigations, preservation and future courtroom testimony.

The Co-Chairs of the committee, Assistant Prosecutor Rory Wells and Eduardo Neeter, Principal of FactualVR, both addressed the attendees with valuable input. Topics ranged from training and investigations to the use of VR at trial and in rehabilitation/reentry after serving time in prison.

“It’s not a matter of ‘if’ but ‘when.’ As the technology continues to develop and become mainstream, people will eventually demand that VR be used in every courtroom,” stated Co-Chair Eduardo Neeter.

For information on the committee or future events, please email us at: info@thevrara.com.

www.thevrara.com

VRARA SF Chapter Event: The Art and Science of Lightfields (video)

IMG_8548.JPG

What are lightfields and why are they important? This is a question that VRARA SF got to tackle last week by interviewing Lytro at our Fall chapter event (video below).

The practice of lightfields is split between capture (cameras and lens arrays) and display (holographic panels and VR headsets). Lytro is the industry leader in capture, with a range of consumer to professional/cinematic cameras. Its latest work is Hallelujah, produced with Within.

In fact, our discussion with Lytro came one week before Hallelujah's public release. You can see that here, making this week's Friday video a double feature. The full Hallelujah production involves 6DOF positional tracking, but this more portable version offers 3DOF head tracking.

As for our discussion with Lytro, the company's insights on lightfield capture have been formed over a long tenure in the field. Starting with refocusing cameras, it evolved to cinematic rigs, and on to today's flagship Immerge camera, which was used in the Hallelujah production.

One thing evident from the discussion is the degree to which lightfield capture blends optical and digital technologies, making it a technically demanding field. Some of the challenges on the digital side will recede as Moore's Law improves the processing and compression of lightfield data.

But at the same time, Moore's Law doesn't govern optical technologies. Notice how camera lenses haven't changed much while other consumer tech like smartphones and flat screen TVs have streamlined. Optical equipment is generally the size it needs to be to do its job.

So while the digital barriers to lightfield capture will shrink with Moore's Law, some of the camera rigs and lens arrays required to capture lightfields will remain large (not a bad thing). For example, the degree of parallax you can achieve from a lens array is a function of its diameter.

Lightfields also aren't a silver bullet for AR & VR. Polygon approaches (graphics) are prevalent today in volumetric VR, and will continue to be advantageous for interactivity (games, etc.). But lightfields will excel for playback of photorealistic or cinematic content. They can work together.

"We absolutely see them coexisting," said Lytro Senior Architect Colvin Pitts. "It's a little bit tricky to manipulate lightfield data in real time, but combining the two is absolutely on the table."

See the full interview below with Pitts and Lytro Director of Engineering for VR, Alex Song. And stay tuned for more VRARA SF Chapter events in the coming months.

VRARA VR AR Panels during Advertising Week New York - RSVP Now!

RSVP now as tickets are limited!

Screen Shot 2017-09-22 at 9.51.43 AM.png

Narrative in VR: How to Create Compelling Stories with Virtual Reality

Panelists from Verizon, Immersive Storytelling, Caitlin Burns, 30 Ninjas. Moderated by Chris Pfaff.

RSVP here

What Do Agencies Need to Know About Virtual Reality?


Panelists from Accenture Interactive, McCann New York, GLOW Digital Agency, Havas Health. Moderated by Cortney Harding.

RSVP here

VR/AR Association Releases Vancouver Ecosystem Infographic: 130+ VR/AR Companies and Growing

infograph_12.png

130+ Companies and Growing!

(Thursday Sept 21, 2017) - Vancouver is a city in the heart of the Pacific Northwest, renowned around the world for its majestic mountains, pristine waters and stunning beaches. Recently, Vancouver has also been making a name for itself for another reason: the city has become a global hub for VR/AR/MR and is home to 130+ innovative companies in the space. These companies are solving problems, creating immersive storytelling experiences, educating, and building the next wave of computing.

The city boasts a 40-plus-year legacy in film & television production, more than 30 years of cutting-edge VFX & animation, a world-class games & mobile entertainment cluster over two decades in the making, a strong tech scene, and a place among the top startup cities in the world.

Click here to read the full report

Watch our VRARA SF Event in 360

IMG_8544.JPG

Last week, VRARA SF held a quarterly chapter event focused on lightfields. What are lightfields and why are they important to immersive media like AR & VR? These were the questions we tackled during the event. 

For more detail, you can watch the entire video below. And to show that we're walking the walk with VR, we've captured the event in 360. Watch it embedded below or view in cardboard mode (stereoscopic, head tracking, etc.) in a compatible headset.

We'll have more coverage soon, including individual session videos (traditional 2D) with soundboard audio. Until then, special thanks to all of our sponsors and speakers.

Bay Area: Join us in December at VRX 2017

Contact us for a discount code. Super early bird pricing ends Friday.

In VRARA's continued partnerships throughout the event world, the latest is VRX 2017. Taking place 12/7 & 12/8 in San Francisco, it features heavy-hitter speakers from HTC to Audi.

With a theme grounded in VR's business growth and cultural immersion, it promises lots of concrete learnings, not to mention networking with industry influencers. 

VRARA SF is proud to be a media partner, meaning we'll not only be there but members can receive a 15% discount code to attend. Contact us if interested. We hope to see you there.

More from the event organizers:

Now in its 3rd year, VRX has established itself as the world's premier gathering of senior-level virtual reality business leaders. Previous speakers include a global who's who of VR pioneers, with CEOs and senior decision-makers making waves in gaming, film, enterprise, broadcasting, healthcare, education and more. Expect more of the same this year, with more to be announced.

How Will Advanced Display Technology Drive VR/AR Adoption?

By Chris Chinnock, Insight Media chris@insightmedia.info

DisplaySummit-Dates-Location_640.jpg

That is one of the key questions that will be addressed at the Streaming Media for Field of Light Displays (SMFoLD) workshop to be held on Oct. 3 followed by Display Summit on Oct. 4-5.  Both will be held in Sterling, Virginia and will feature leading technologists discussing the state and future of immersive displays and the infrastructure needed to deliver these compelling images.

NOTE: VRAR Association members get a 20% discount on Display Summit registration. Email info@thevrara.com for your promo code.  Display Summit registration includes access to SMFoLD workshop as well.

At Display Summit, we will be looking at component technology, headset designs and application requirements for AR and VR.  Most of the focus will be on non-consumer applications and trying to understand how advancements in consumer-facing products and technology will enable other professional and commercial applications.

For example, one of the promising new display technologies for VR/AR applications is microLEDs. For this application, most are focusing on a monolithic approach where a wafer of blue GaN LEDs is bonded to a CMOS active-matrix backplane that drives the individual emitters. These LED arrays are high density, with small (3-10 micron) emitters. The challenge is that only blue light is created with this process, so some sort of color-conversion technology is needed to get to full color. Phosphor conversion won't work, as phosphor particles are 100x to 1,000x bigger than the emitters and no good green phosphor exists today. So how do you solve this problem?

At Display Summit, VeraLase will describe their approach to color conversion called Chromover that combines photoluminescent quantum wells with a novel resonator to enable bright, efficient full color microLED microdisplays.

Another approach will be described by Nanosys, which is developing quantum dot materials that absorb blue light and re-emit in the green or red. Their talk will discuss the requirements for quantum dot color converters for microLED displays and the current development status of quantum dots for this application.

Market analysts from Yole will also be there to provide an overview of technology trends with microLEDs along with insight into their adoption in a number of applications including AR/VR.

Most agree that current image quality needs to be improved significantly in AR/VR headset designs.  Factors that impact the design include resolution per eye, field of view, latency, frame rate, color gamut, motion and other artifacts, etc.  While creation of high quality headsets is mostly possible, the size, weight, power, ergonomics and cost would not be suitable for most applications.

To address these trade-off challenges, VR/AR industry guru Karl Guttag will focus on wide field of view and high angular resolution headset design.  He will provide an analysis of this design space and the trade-offs we have to make today, and what new technologies should allow in the near future.

One of the key components of AR/VR systems is the waveguide optic.  This device typically features a holographically-defined input optic to capture the image from a microdisplay and allow it to propagate inside the waveguide.  To extract the image and present it to the user, more holographically-defined optics are used.  Achieving full color and wide field of view can be a challenge, so how can this be addressed?

Luminit will provide an overview of the design principles of these holographic optical elements (HOEs) and some insight into how the company manufactures them and their performance characteristics.

Digilens will likewise describe their approach to full color wide FOV waveguide design and profile their current use in Head-Up Displays and future use in AR/VR headsets.

For those who actually design and build AR/VR headsets, the ability to characterize performance is always a challenge. Few standardized methods for device characterization exist and few metrology companies support this emerging area.  Fortunately, Radiant Vision Systems is focusing on this area and will provide some insight into the tools they have developed to characterize optical performance of dense, high resolution microdisplays.

We are also seeing a huge desire to use AR and VR technology in many commercial, military and professional applications where training is required.  The use of AR and VR is being explored vigorously in many industries, so adapting the tools and technology from consumer-facing products can be a cost effective way to move forward.

But the needs and challenges of these non-consumer applications are different.  For example, enterprises need to be able to train personnel locally or remotely in a VR or AR environment.  EON Reality has stepped up to this need by developing their EON Enterprise Virtual trainer.  

This solution provides a unique collaborative 3D virtual training environment that allows a Trainer to train students either locally or remotely. The Trainer starts a ‘Lesson’ consisting of a 3D virtual model within a 3D virtual environment; the Student, via a Head-Mounted Display (HMD), is immersed in this environment and receives direct instructions via VOIP from the Trainer, instructions/prompts from within the 3D environment, or a combination of both.

But hardware is not the full solution of course.  Careful choice of display content is very important to provide tangible operational value to wearers of these systems. Synergy and compatibility with other platform displays is another very important design factor.  Rockwell Collins will discuss these issues in the context of AR displays for training.

Another challenge is the fast pace of innovation in the AR/VR market.  Many non-consumer applications need solutions that will last for years and relying on consumer products means products, parts and support may not be available over this time period.  On the other hand, developers want to be able to take advantage of recent upgrades in technology.  So how do you solve this dilemma?

One way is via standards efforts such as OpenXR with middleware platforms providing solutions today.  Sensics CEO Yuval Boger will describe the problem of future-proofing training systems, review existing solutions for this problem and describe ongoing efforts.  He will conclude by describing effective strategies to keep training systems current.

Finally, Rockwell Collins will describe their Integrated Digital Vision System (IDVS).  It is an advanced combat helmet mounted display system for warfighters that combines real-time mission data with multispectral vision sensors into one view for enhanced situational awareness.

The IDVS sensors include two low light level Visible/Near-InfraRed (VisNIR) sensors for binocular night vision, as well as a single Long-Wave InfraRed (LWIR) sensor for thermal imagery. On-board processing fuses the sensor video with incoming data from various sources (such as a command center, other warfighters or UAS) for low (less than 5ms) latency augmented vision, day or night.

The first prototypes utilized two high-resolution OLED microdisplays with see-through free-form prisms for near-eye display. The next-generation IDVS will incorporate high-definition waveguide displays for better see-through quality and higher brightness.

SMFoLD web banner.jpg

The SMFoLD workshop is designed to provide an overview of the light field ecosystem from content creation/generation through distribution to advanced 3D displays.  These displays can and will be large theater-sized, desktop monitor scale, or mobile phone and AR/VR headset types.  The workshop will also focus on the formatting, signaling and encoding of light field data for efficient distribution over networks.

Such a streaming standard could be useful for the distribution of all kinds of large data sets.  This can include VR/entertainment files, medical data (CT, MRI), CAD data, geophysical and photogrammetry information, point cloud data, SAR data and much more.  The data sources are out there, but the advanced visualization systems and delivery mechanisms need work.  This workshop is designed to advance this discussion.

More information on Display Summit and Stream Media for Field of Light Displays can be found at:

http://www.displaysummit.com/2017_display_summit/

http://www.smfold.org/2017-smfold-workshop/

Contact: Chris Chinnock, Insight Media, 203-831-8464, chris@insightmedia.info

World’s First Mixed Reality Arcade Opens in Stamford, CT

Featuring Microsoft’s HoloLens devices, The Holodeck is the first of its kind, bringing true holographic and virtual gaming to the general public. Located in the Stamford Town Center, it will have not only HoloLens holographic games but HTC Vive and good ol’ fashioned Xbox stations as well. “There is something for everyone, from the casual to the hardcore Xbox gamer, here at The Holodeck,” says Todd Fuchs, Founder and Chief Innovator.

While most Entertainment Venues are focused on VR and PC Gaming, we wanted to focus on technologies that provided a virtual experience blended with the real world in order to provide a truly and physically social atmosphere where gamers weren’t isolated in their own worlds. The HoloLens was the natural choice, as we believe the future of arcades, whose very nature is location-based gaming with others, must be Mixed and not just Virtual reality.
— Todd Fuchs

Why not just Arcade? The Holodeck feels that the term Arcade is a bit outdated. So instead it calls itself an Innovation Center, and for good reason; not only can the general public come and experience MR, VR and Xbox Gaming, but independent game developers can rent time in one of the several on-site “DevPods”. Here they can use The Holodeck’s equipment to test out and develop their ideas on real hardware devices that they may not otherwise be able to afford. The Holodeck also offers on-site, full-time developers of its own, providing access to live help for any DevPod Guests.

“We are extremely excited to be living in an age where Guests can play virtual games based in the real world, collaboratively or against each other while sharing the same physical space and enjoying all the perks that arcades of yesterday provided to us growing up. We believe that not only is now the perfect time, but we want to be on the forefront of helping others realize this same future, by providing access to the tools and hardware necessary. And for Guests that just want to be part of a unique lounge-like setting and play some VR or Xbox, we’ve got that too.”

The Holodeck has several options for the experiences ranging from paying by the half hour, up to a monthly Membership full of perks and Member-only Events. They have certainly tried to think of everyone and provide a world-class upscale environment to match their mission. You won’t find any folding chairs, junk food or dirty rugs here. Even the Staff are dressed in nice black attire covered by futuristic white lab coats. “We want every Guest to feel as though they are a VIP, a king or queen. We take pride in offering cutting-edge gaming paired with an unmatched level of service, in a clean, up-scale environment dedicated to true gamers.”

The Holodeck opened in July 2017 at The Stamford Town Center, directly next to the GameStop: 100 Greylock Pl, Stamford, CT 06901.

Website

Meetup page

Facebook page

Instagram 

For more information contact media@theholodeck.com

Media Coverage: 

Click to read: "Arcade Owner Making His Dream A (Virtual) Reality"

With so much great success in a few short months, we are spawning a second, flagship location at the Palisades Center in West Nyack, NY! It’s opening Oct 9th
— Todd Fuchs