Archive: Mar 2019

How Wizards Unite Will Impact AR

What are AR portals and why do they matter? Phil Charnock wonders whether Harry Potter: Wizards Unite will fuel a desire for brands to experiment with this compelling augmented reality technique.

Pre-registration has now opened for Harry Potter: Wizards Unite. This new title from Warner Bros and Niantic – the creators of the record-breaking Pokemon Go – is likely to be released within weeks. The last time Niantic combined geo-location gaming, augmented reality and a major entertainment franchise, the impact was felt across games, tech and wider society.

This time around it is the use of AR portals within Niantic’s latest offering that could herald a similar revolution. Why? Because portals are a compelling use of AR that allow you to literally step into the world of the video game. This means you get the location-specific goodness of Pokemon Go combined with the ability to move around a 3D space (almost) as you would in virtual reality. What’s not to like?

The impact of Pokemon Go

OK, let’s recap. Whether you are a fervent enthusiast of Pokemon Go or a naysayer who espouses its ‘demise’ in the years since its sensational launch, the impact made by Niantic’s smash hit is undeniable. As an agency working with augmented reality, Draw & Code have felt the impact of Niantic’s magnum opus on the industry. Suddenly we went from having to explain to potential clients what augmented reality is, to people from all walks of life approaching us already brimming with ideas for how AR could be used in their own industries.

There are some big numbers that swirl around Pokemon Go. Over $3 billion in revenue generated. More than 800 million app downloads. It reached 100 million users within a few weeks – television took 75 years to reach that many homes!

Could Wizards Unite have the same impact? It’s tempting to say yes – Niantic have been pushing their AR and geo-location technology further in the intervening years and clearly the market for a game utilising this style of gameplay is now well and truly proven. The flip side is that Harry Potter is not as big a brand as Pokemon, by some measures at least.

What to expect from Wizards Unite

Draw & Code were treated to a little insight into Wizards Unite some months ago at Niantic’s headquarters in San Francisco. At this point the game was taking shape behind closed doors but the glimpse at Niantic’s studio and how they work was enlightening. There is a lot of love going into this game. And a lot of money too.

The challenges of creating geo-location and AR games are not to be underestimated. This is a video game, aimed at the ever-sensitive family audience, that interacts with the real world – an inherently challenging concept both technically and culturally. It’s no surprise that Wizards Unite is taking its time to emerge from the studio. For example, Niantic will want to avoid some of the negative press around inappropriate real-world locations being a part of the game this time around – and that’s going to take a lot of effort to get right.

What is a Portkey in Wizards Unite?

From the point of view of an augmented reality developer, the inclusion of AR portals in Wizards Unite is getting us excited. Why? Because they are a neat way of bringing extra immersion into a mobile app. You locate a portal – known as a Portkey in Wizards Unite – and physically move inside it, just as you would step into a room through a door. Then you can spin your phone around and see the 3D world all around you. What could be more theatrical than getting players to actually step into a video game environment?

As a matter of fact, Warner Bros have already experimented with AR portals in a tie-in with a book and movie property. For the launch of Ready Player One, the entertainment giant commissioned geo-location app Snatch and Draw & Code to deliver a portal that allowed the user to step into the world of the movie. At the time this was one of the first commercial activations using this technique – expect to see a lot more after Wizards Unite launches.

How might these portals be used by other brands? Aside from games, expect experiential marketing and training uses of AR portals to proliferate. It makes sense for place-making apps too – stepping inside a landmark location is a neat way of making the viewer feel a part of the scene.

‘True’ AR is now viable for Niantic’s audience

Previously Niantic had focused on the geo-location aspect of Pokemon Go, with the AR component being a simple camera overlay rather than something truly contextual or environmentally aware. This was the right approach at the time – it was before sophisticated SLAM (simultaneous localization and mapping) AR was widely available. Now, with a plethora of devices able to run either Apple’s ARKit or Google’s ARCore, Niantic have deemed that ‘true’ AR features are usable by a high proportion of users. If they’ve got their sums right, it’s a big deal for the entire industry as it will create an appetite for more advanced AR experiences.

The industry will be expecting – and will welcome – copycats. The AR portal and other similar features may be about to feature in all kinds of apps, which is good news! It’s a cracking way of engaging with a mobile immersive experience.

If you want to get ahead of Niantic and use AR portals with your brand or to see how other applications of immersive technology can work for you, the wizards at Draw & Code can bring a little XR magic to your door.

Draw & Code are on the lookout for a 3D Artist!

Job Description

The team at Draw & Code are looking for a new 3D Artist to help us create the next great AR/VR games and interactive experiences. This is your chance to work with an ambitious studio that is gaining a global reputation in the burgeoning immersive tech sector.

RESPONSIBILITIES
– Asset creation for environments, props and characters, including look development, modelling, texturing, rigging and animation
– Lighting and rendering
– Working closely with Unity dev team to achieve desired look and quality
– Working to schedule and delivering work to agreed timelines

REQUIREMENTS
– Experience using 3D art creation tools such as Maya
– Experience in 2D software such as Photoshop
– Strong understanding of composition, light and colour
– Knowledge of optimisation processes and rendering techniques
– Strong communication skills
– Solid work ethic and a positive attitude in the face of challenging situations
– Strong time management and organisation skills
– Degree in Computer Animation or Game Art is preferable

BONUS
– Experience with AR / VR / MR
– Experience working with game engines such as Unity
– Knowledge of other 3D visualisation software such as Cinema4D / After Effects
– Experience with projection mapped animation
– Experience of Revit and other BIM tools and software

Although we’re keen on good qualifications, a strong portfolio will certainly steal our hearts.

We are not looking to work with recruiters or recruitment agencies – please don’t contact us.

Candidates can apply by emailing their CV and Portfolio to [email protected]

We Tell Stories – Why XR Needs Diversity

Immersive technology means you are able to place yourself inside a world, not just to peer inside it. Like all of the tech industry, XR features a diversity imbalance – is this burgeoning sector a chance for us to finally build a diverse corner of technology? And does its inherently first-person nature make it all the more important that diversity is pursued? Draw & Code’s Annie O’Toole opines on why the XR industry needs diverse perspectives.

It’s International Women’s Day and I am thinking about stories. Why? Because this morning I was reminded of the eternal Isaac Newton quote that “If I have seen further than others, it is by standing upon the shoulders of giants”. And I began to wonder who and what these giants may be in our world. Our urgent, cutting edge, immersive world.

The legacy of giants is very much alive in the everyday for us. Ada lives in each line of code, Steve in every swipe and Tom in every VR headset. Their work is alive in the hardware and the software that is used by nearly every single person both in our studio and on this planet. A commendable success, a feat that most certainly makes them giants amongst us mere mortals. To put it simply, their work altered our world and ourselves forever.

And with time this legacy of theirs has morphed into something far greater and most certainly far more human than I suppose they ever would have anticipated. Machines and solutions once deemed robotic and devoid of human emotion – and human error – are being reimagined into something that really is very human. Their legacy has begun to remould the art of storytelling.

Storytelling is one of the most deep-rooted and essential aspects of being human. We are all storytellers: in our minds as we think about the everyday, in the boardroom where we discuss future sales and in our homes as we raise our children, build lasting relationships and gorge on the works of J.K. Rowling, Stephen King, Charles Dickens and Jane Austen. Even our neighbour’s series of tweets about the big match tells a tale – at the core, these things that we do are all stories. And these stories encompass our visions of the future, opinions on the past and our interaction with the present.

Look at the applications we have devoured: Snapchat tells a story through quickly captured footage, Facebook tells a story through a status, Instagram through an image, Twitter through a microblog. We are all authors in those worlds.

But here in the corner of the tech industry that Draw & Code inhabits, we boast something quite magical: immersive technology. It is the wholehearted belief of many that immersive technology is one of the greatest gifts we have created as storytellers. Immersive experiences, particularly via VR, are about embodiment – you are in the shoes, or at least the eyes, of the protagonist. It’s not distant, there is no cinematographer – this is as close to living the story for yourself as it gets.

Yet we are serving it an injustice. It is our job, as the purveyors of this magic, to tell the story correctly. However, we cannot tell stories properly, fully, unless we have the experiences and the opinions of many. We cannot tell stories properly until we understand the many angles and focal points a story is viewed through. And so we bang the drum for more diversity, but I often wonder if shouting about it is enough.

Surely if people knew how incredible this corner of technology was, they would come and join it? Society is clouded with old judgements and stereotypes, all too quick to assume that technical work is filled with only the things that a man can love, rather than the inspirational and incredible things that actually exist in our day-to-day life in this studio. At Draw & Code I’ve found myself surrounded by and contributing to projects that take audiences back in time to the world inhabited by the Terracotta Army, make toys burst into life in the palm of your hand, and explore interior design solutions for leading retail brands. Immersive technology is in demand right now and it’s a passport to adventures across a multitude of sectors. And it’s the opportunity to work in teams filled with exceptional talent and exceptionally warm hearts – people love what they do in XR.

So we, like much of the XR sector, continue to work on our magic with our male-dominated teams and our solutions that enchant and excite. But how much could the output of the immersive industry be improved by looking beyond the current workforce? And while we will always enthral the tech-savvy, could the products produced by this evolving industry have an even wider appeal if they came from teams that represented a bigger audience? Our immersive technology corner has a duty to encompass everyone, if at all possible. To awaken the minds and memories of the old, to excite the generation of tomorrow and to alter the thinking of those who can shape the world. Our corner has the potential to make more stories: in the home, in the workplace, on the bus. And it will.

So on this International Women’s Day I ask each of you to see our corner and our teams, our work and our future not as robotic and dull, but rather as our way of opening minds, eyes and opportunities to the places that otherwise would seem impossible to access and improbable to exist. Then tell your daughters, your sisters, your aunts, your grandmothers, your doctors, your engineers, your teachers, your window cleaners, your shop assistants, yourself. Because we’re not just machines and code, we’re the people who will tell the stories of tomorrow and we want our stories to be whole.

From Virtual to Reality

Draw & Code’s Phil Charnock ponders the impact of our VR interpretation of the Neuron Pod – a daring new building that opens this week in London.

Monday marked the opening of Queen Mary University’s bold and brilliant Neuron Pod building. With the building years in the making, it seems like a good time to revisit the virtual reality visualisation that may have played a small role in helping get this ambitious project built.

Virtual reality is visceral, arresting, memorable. Like all forms of spatial computing, the sense of presence is magnified and the scale of 3D objects and environments can be accurately represented. As such, it is the ideal medium for conveying the design of new buildings. Indeed, some of Draw & Code’s first immersive projects were collaborations with architects. Here are people who already think with this enhanced sense of presence and scale – spatial computing is made for them.

Our very first work with immersive technology and architecture was in collaboration with All Design. This practice was founded by the late Will Alsop, one of the most daring and exciting architects there has ever been. The work we contributed to was a nascent project that didn’t make it to the real world, but it did exist as both a virtual and an augmented model. The former allowed us to get a 1:1 view of the building, while the latter was all about seeing the proposed building in the context of its environment via an elevated vista of the scene. It taught us a lot about the demands of working within architecture, and it also brought All Design an understanding of how immersive technology could work for them.

Fast forward to early 2015 and All Design were back in touch with a new challenge. Using the then-new platform of Samsung Gear VR, they wanted to present a wild new design to potential funders, sponsors and other stakeholders. This was Queen Mary University’s Neuron Pod and it was a suitably Alsop-esque design. Despite being so bold (upon showing it to a colleague recently they asked ‘do people actually go inside it?’) it had already gained planning permission. And most of the funding to realise the construction was also in place. Most, not all.

Queen Mary University were looking for a funding boost of £900,000 and had organised an event adjacent to their other Will Alsop building – the Centre of the Cell. In advance of this, the Draw & Code 3D experts were sent the highly detailed CAD files of the proposed building. As with any model of this nature, when converting it into something ready for a real-time game engine the number of polygons had to be slashed. VR and its stereoscopic view means that another chunk of processing power is used up as the scene is rendered twice – once per eye – on the same device. Then throw in the fact that this is for a mobile device and it’s fair to say that a lot of polygons were going to be shorn!
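
We won’t detail the exact pipeline here, but as a rough, hypothetical illustration of that kind of polygon-slashing, here is a minimal Blender Python sketch that decimates a mesh before export to a real-time engine – the tool choice and the 10% ratio are our assumptions for the example, not a record of the actual workflow:

```python
import bpy

# Hypothetical sketch: cut the face count of the currently selected mesh
# using Blender's Decimate modifier, ready for export to a real-time engine.
obj = bpy.context.active_object

decimate = obj.modifiers.new(name="Decimate", type='DECIMATE')
decimate.ratio = 0.1  # keep roughly 10% of the original faces (illustrative)

# Bake the modifier into the mesh so the simplified geometry is what gets
# exported (e.g. as FBX for a mobile VR build).
bpy.ops.object.modifier_apply(modifier=decimate.name)

print(f"Faces remaining: {len(obj.data.polygons)}")
```

In practice the budget is tuned per asset, but the principle is the same: mobile, stereoscopic rendering leaves far less headroom than an offline architectural render.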

So would losing polygons and making backdrops from screenshots of Street View (really!) result in a lesser way to view the Neuron Pod? We boarded the Virgin train to London to showcase the project at the all-important Queen Mary University event, so we were about to find out!

Once at the venue it all felt very familiar. We had spent so much time in VR looking at the space next to the Centre of the Cell where the Neuron Pod was scheduled to be built that the immediate surroundings felt like somewhere we already knew. That was VR doing its job right there!

Fast forward to the event itself: once the speeches were underway in the grand, modern lecture theatre, a full, animated render of the Neuron Pod proposal appeared on the big screen. Everything was bespoke, the detail was sky high, light glinted off panes of glass, people walked through the scene and lens flare was cast from the sun above. Fewer polygons had been lost from the architectural models and there were certainly no screenshots of Street View involved! This looked spectacular and we collectively felt nervous – how could our mobile VR experience compete with this?

An hour later and it was our turn. At the end of the talks the audience were directed to come and try Draw & Code’s VR visualisation of the Neuron Pod. After seeing the superb flythrough we feared the worst – would people welcome a VR experience delivered via a mobile phone? The very first person to try the experience happened to be the artist responsible for the animated visualisation on the big screen. This could be embarrassing. Or so we thought.

See the photo of the guy looking up and smiling in a VR headset? That’s him. Phew, it had passed that little test. As for the rest of the event, a succession of happy, engaged people walked away from the demo. It’s fair to say that virtual reality was a hit – just as All Design and Queen Mary University knew it would be.

While the rendered flythroughs were spectacular, immersive technology is at the same stage cinema was when audiences were supposedly fleeing from moving steam trains on the screen. Well, that would be true if it wasn’t for the fact that the idea of audiences running from the cinema was a myth. However, it is anything but a myth that people experience visceral reactions to virtual reality and its related mediums. It engages like no other moving image can.

Did Draw & Code’s VR version of the Neuron Pod contribute to it receiving its funding and eventually going on to open four years later? We’d like to think so – as software developers and animators we are used to shifting things around on a screen, not to seeing our work influence the built environment. Working in this studio may be a dream vocation, but the thought of having a tiny influence on the destiny of a lasting and impactful piece of architecture is a dizzying one, so excuse us if we celebrate the opening of the Neuron Pod.

Six XR Innovations at MWC That ARE NOT Hololens 2

The Hololens 2 dominated the immersive tech headlines from MWC 2019 – but what else was lurking across both the main show and the buzzing startup space 4YFN? We pick six that caught our eye…

1. NReal Mixed Reality Glasses

The key word here is glasses. Usually mixed reality means you would use the word headset or goggles, but NReal is very different. They may have a name like an early ‘90s rave act, but they are very serious about bringing us closer to consumer MR. After Draw & Code tried them at their reveal at CES, we came across them at MWC in a fetching shade of bright orange – a bold look for a bold concept. Why were NReal at the show? They’ve just secured a sweet $16m investment and a partnership with Qualcomm who will be supplying some of the hardware needed to make the dream of some very smart glasses a reality.

https://www.nreal.ai/

2. Niantic Codename Neon

A joint project between Niantic, Samsung, Deutsche Telekom and MobiledgeX saw four players using Samsung S10 smartphones as controllers in a multiplayer battle game. A demo for the time being, this 5G-enabled experience brought a real-time, shared AR battle to the MWC show floor – the first mobile AR game to use edge computing in this way. It reminded the Draw & Code team of our Companion collaborative XR demo from way back in 2014, which saw players in VR, AR and on flat screens able to see each other and interact in a 3D world – a long time before 5G! It was a pleasure to try a demo that tallied so closely with our own ideas of the future of play.

https://www.nianticlabs.com/

3. BroomX MK Player360 Projector

Draw & Code have been working on projection mapping since before we were Draw & Code – and the BroomX MK Player360 looks set to fill the gap between the bijou Lightform solution and the super-expensive high end of the market. While BroomX’s offering has been around for a year or two, this was the first time we got to try it for ourselves. With just about everything you need to create a spectacular shared projection environment in a single device, it was a revelation to us. As with so much immersive technology, to truly appreciate it you have to see it in action – and after our demo we were smitten.

http://www.broomx.com/mk-player360.php

4. VividQ

A UK startup of brilliant boffins, VividQ were situated right by our own SwapBots stand. A prototype headset helped to show their vision – a truly holographic platform. The headset was initially developed because there was nothing else able to display their holograms as intended. Where the real expertise lies is in the processing of point-cloud data to create lightfield-ready imagery. The software underpinning all this could find itself a key component in the advanced 3D headsets that will inevitably appear over the coming years.

https://www.vivid-q.com/

5. LetinAR

The LetinAR PinMR lens is an off-the-shelf solution for smart-glasses optics. Using the magic of mirrors, tiny reflected images are layered onto the glasses. With recent investment, LetinAR could become part of the building blocks of the smart glasses – or even full AR headsets – of the future. The demo was crisp and bright and offered a 120-degree field of view – all thanks to pin-hole camera-inspired technology with embedded mirrors reflecting light back into the viewer’s eyes.

https://letinar.com/

6. Microsoft Azure Kinect DK

OK, we did say that this roundup isn’t about Hololens 2, but we have to mention another Microsoft announcement: the Azure Kinect. Revealed alongside Hololens 2, it was easy to miss this sophisticated piece of kit in the ensuing excitement. However, we have been craving a comeback for the Kinect after we hacked the previous version to use as a spatial camera for a radical take on the music video and the documentary. Featuring cloud-based AI interpretation of the sensors and their outputs – including voice, depth camera, body tracking and more – this has so many possibilities for open-minded developers. There’s a minimal capture sketch below for the curious.

https://azure.microsoft.com/en-us/services/kinect-dk/
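
We haven’t built anything with the new device yet, but to give a flavour of how approachable grabbing its frames could be, here is a speculative sketch using the community pyk4a Python wrapper around the Azure Kinect Sensor SDK – the wrapper, its default configuration and this workflow are our assumptions, and the body tracking and speech smarts mentioned above live in separate SDKs and cloud services:

```python
from pyk4a import PyK4A

# Speculative sketch (community pyk4a wrapper, default configuration):
# open the Azure Kinect DK, grab a single capture and inspect its frames.
k4a = PyK4A()
k4a.start()

capture = k4a.get_capture()
if capture.depth is not None:
    # Depth arrives as a numpy array of 16-bit values in millimetres.
    print("Depth frame:", capture.depth.shape)
if capture.color is not None:
    print("Colour frame:", capture.color.shape)

k4a.stop()
```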