
Teaching Augmented Reality: Making More Than Novelty

PART 1

Collaborative Teaching for a Collaborative Medium


In 2019, my University of Maryland colleague, Daniel Brown, and I sat down to come up with a class we could co-teach in the soon-to-launch Immersive Media Design (IMD) program. The IMD program would be unique: for one, it would focus specifically on immersive and interactive media like interactive projection design, motion capture, and virtual and augmented reality; and for another, it would be part of both the department of computer science and the department of art. Students would go through the program specializing in either design or development, but would be cross-trained in both.


Daniel is a Computer Science graduate student with a background as a 3D and environment artist on AAA games, as well as on mixed reality experiences as part of the Microsoft HoloLens team. I come from a performing arts background, having created, produced, and performed in a wide range of experiences, from Broadway plays to live performances in virtual reality, to documentaries, to immersive installations.


The initial concept for the class was to experiment with a teaching and learning environment that leveraged the power of teams. We both recognized that almost all compelling experiences in augmented and virtual reality are created by teams with mixed skills, experience, and backgrounds upon which to draw. They work in a highly iterative manner, constantly testing concepts and approaches and seeing what works and what fails. We would co-teach the class, modeling the kind of cross-disciplinary collaboration that we wanted our students to engage in. Daniel would lead discussions on development and prototyping, I would lead discussions on ideation and project management, and we would both lead conversations on the design process and documentation.


Untapped Potential

We decided to focus the class on creating experiences for augmented reality (AR) specifically, calling it “Augmented Reality for Creatives and Coders.” In 2019 (and this is still largely true as I write in 2023), augmented reality's potential for creating meaningful, necessary, and compelling experiences had gone largely untapped, especially when compared to the content being created for virtual reality. AR has found utility in a few areas, like novel marketing gimmicks, tools for picking furniture, and heads-up displays for factory workers and first responders, but its potential to incorporate story and narrative has been largely unexplored. Why is this? How could we approach making experiences utilizing AR differently? The class, then, would become a cross between a laboratory and a design studio, a space where both the instructors and the students could test different methodologies for creating AR experiences and see what could be built.


It was an ambitious goal: a class that would teach students with little exposure to augmented reality and the primary software used to build it (namely Unity) both the technical and conceptual skills necessary to make fully-realized AR experiences. We would focus on process, but by the end of the semester, expect student teams to produce working demos and present them to an invited audience. It was the equivalent of bringing in a group of students, introducing them to this thing called "motion pictures," teaching them the most rudimentary basics of how to shoot content, and then expecting them to make short films by the end of the semester. And all of this before the advent of modern workflows and software that streamline the creation, capture, editing, and distribution of content, so motion picture creation circa 1910.


Like pioneers in the early days of film, Daniel and I saw the huge unrealized potential of augmented reality to allow us to see the physical world around us in new ways, to create novel interactions between digital and physical places, objects, and people. Unlike with virtual reality, almost everyone already has an augmented reality-enabled device in their pocket: their smartphone. And, in the foreseeable future, augmented reality-enabled wearables might become as ubiquitous and integral to our day-to-day lives as smartphones are today.



But in a very real way, the medium and technology used in the class were secondary to the collaboration skills we hoped the students would acquire. By putting coders together with creatives, and training them on how to communicate and build something challenging and novel together, we hoped to give them fundamental "soft skills" that would serve them throughout their careers no matter where they decided to work.


We expected our students to get stuck or lost at times as they grappled with AR, and honestly, we expected the same with our teaching model. We were betting that by inviting the students to figure out how best to work with AR alongside us, rather than presenting ourselves as having all of the answers, we could encourage richer discussion, research, and ownership over the learning process. Of course, if the structure was too loose or if the expectations were set too high, there was a good chance that the students would check out or give up.


We were all set to get into a classroom and test our road map for the course. Then the pandemic hit. What was already a highly ambitious proposal would now have to be taught over Zoom.


PART 2

Getting Started

It's January of 2021. I'm asking myself a few questions: What happens if I get COVID? What happens if my students get COVID? How do you teach a new class about a new medium (augmented reality) in another new-ish medium (remote learning)? The first day of class begins and there we are, little faces inside 2D boxes, or just black 2D boxes in the case of many of my students who chose to leave their cameras off. I discovered this quickly about teaching over Zoom: you learn your students' names fast and never forget them, but you may never recognize them if you pass them in a hallway!


Initially, my co-instructor, Daniel Brown, and I had planned to focus our exploration of augmented reality (AR) design and development around creating location-based experiences. But now that the class was fully remote, we, like the rest of the world, needed to pivot a bit. Instead of AR being used as part of a physical experience in a specific place, we focused on incorporating AR into mobile apps more generally. Though the medium and context were now a little different, we hoped that most of the underlying principles of the class could stay the same.


First Steps



We decided to devote most of our roughly five hours a week of class time to introducing the students to design and project management principles. As for Unity and the technical information necessary to actually build AR apps, while we spent a little time in class introducing the students to the technical tools that first year, we mostly pointed them towards existing documentation and tutorials publicly available online. It would largely be up to the students to spend time exploring the software between classes. (More on how this choice played out in a bit.)


The students in our class came from many parts of campus. We had a good number of computer science and engineering students who were curious to explore AR and try a class that was more creative and design focused. Conversely, there were art students interested in learning how to use game engines, and students from journalism, psychology, and other majors who were just generally curious about AR.


Research assignment slide by Boluwatito Ope

With such a wide range of skills, experience, and exposure to AR, we began by discussing what augmented reality is and how it is currently being used. For their first assignment, students researched an AR experience that they found interesting to share with the class.


In the first week, we also introduced them to core principles of narrative design and immersive storytelling. We wanted our students to approach the creation of their apps as experience designers, prioritizing the user experience and user journey over tech demos and novelty. We certainly welcomed apps that would provide utility and solve real-world problems, but with this course we wanted to challenge the students to make more than tools.


Ideation and Proposals

I broke narrative design elements down into a series of conceptual areas that aimed to help the students make the connection between the content that they would be creating in their apps and more traditional narrative-driven mediums, like film, and especially immersive performance. The categories included story and narrative, character, audience, agency, accessibility, attention, liveness, presence, immersion, interactivity, and environment.




These categories created the start of a shared working language that the students and I could use to talk about the experiences. The goal was to transform group feedback sessions from "I liked this" or "I didn't like this" to specific reflections that articulated what exactly about the experience felt compelling or not. (Ultimately, I found that attempting to get them to stop using “like” and “didn’t like” stifled feedback. Instead, asking follow-up questions that pushed them to articulate their feelings with greater specificity was generally more effective.)


Once the students had settled on initial project ideas, we demonstrated how to articulate those ideas in the form of project proposals. A proposal clearly and succinctly lays out the why, what, how, who, where, and, if applicable, the when of the experience.


The key part of this exercise is identifying the core question that forms the kernel of the experience. The proposal should articulate the theme or concept at the heart of the project, and that kernel becomes the question the team returns to as they continue building out the experience. The kernel keeps all of the parts working towards the same larger goal. I ask the students to articulate their project kernels as questions because a question inherently encourages searching and inquiry, whereas a statement can imply a fixed goal, idea, or outcome.


For instance, you could frame the story of "Romeo and Juliet" as "a cautionary tale about intolerance" or explore the question "can love triumph over intolerance?" The question offers opportunity for debate and multiple interpretations in a way that the statement does not.

Example of a core project question and project description.

Design Documentation

Once the students had formulated their concepts as project proposals, we introduced them to design documentation. Beginning with concept boards, the students learned how design documents are a valuable step towards translating their ideas into content proposals with specific aesthetics, beats of action, interaction, and mechanics. Borrowing mainly from video game and animation development workflows, the students created mood boards, style guides, storyboards, user journeys, and technical design documents. In this way, before writing a line of code or generating 3D assets or animation, the students had crafted a cohesive set of blueprints for what they wanted to create.


Example of a student concept board, experience map, and moodboard. By Delia Parrish.

Prototyping and POCs

Prototyping and proofs of concept (POCs) came next. This meant building out an aspect of the app they were making and seeing first if they could build it (the POC) and then crafting a version that others could test to see if they should build it (the prototype). Based on what they learned from this stage, the team might go back to revise their initial project proposal or their design documents before building another POC or prototype. We stressed the importance of this iterative creation process over a linear process that does not support going back to revise initial project proposals or design docs.

Prototype by Delia Parrish. Proof of concept by Andrea Adler.

Over the course of the semester, the students created three group projects along with one individual project. With each project, they were given more time and were asked to refine and revise their concepts, design docs, and prototypes further.


Guest Professionals

In keeping with a desire to connect the work we did in class with the professional world of XR, we brought in guest speakers like Daniel Plemmons, a UX lead at Adobe, and Ashley Crowder, the CEO and founder of Vntana, to share their work and best practices with the students. For their midterm group project, I paired each team with a guest professional director or producer working in XR. The guest director presented the students with a project brief or outline, and the students worked together as a mini design studio to translate the concept into design documents and a prototype.

Ashley Crowder, CEO, Vntana and Daniel Plemmons, UX Lead, Adobe

Portfolio Projects

For their final project, the student teams chose their own subject matter and created working demos of their projects that they then presented to faculty and industry guests at the end of the semester.


Here are a couple of examples of projects created by the student teams in the first year of the class.


“Chesapeek,” an AR experience that allows users to engage with and learn about wildlife native to the Chesapeake Bay watershed. The experience begins as a site-specific, geo-located walking tour around Lake Artemesia.

"Chesapeek," created by Amy Hoang, Remy Mezebish, Susan L. Wiesner, and Nuruddin Koraym

“Lone Light,” an AR installation that casts the audience as a paranormal detective sent to investigate a strange murder.


"Lone Light" created by Aishwarya Tare and Mahum Qadeer with Corwin Riordan and Ben Lin

Initial Takeaways

Over the course of that first year, Daniel and I learned an incredible amount about how to teach XR design and development, and the experience reinforced our focus on creative and collaborative process. We learned how useful it was for the students to embrace challenge and failure as a central aspect of creating XR experiences. When the students were given space to choose many aspects of the experiences they wanted to build, they were far more motivated to try, and fail, at building them.


Now, about relying on the students to use online tutorials outside of class time: in the first year of teaching the course, we badly misjudged the support and time that our students needed to work with the core software, in this case Unity and Adobe Aero. Software like Unity is complex and intimidating for new users--just getting it installed and running correctly on some of our students' computers took weeks. Frequent updates to the software created version discrepancies, and continually out-of-date documentation often compounded the problem.


Promising AR-specific software like Aero was in beta and came with instability issues and a lack of support for multiple OS and mobile platforms. As a result, teams leaned heavily on the one or two students who already knew how to code in C# and were familiar with Unity to build prototypes and proofs of concept, rather than everyone on the team feeling confident enough with Unity to contribute directly.


I took in all of this feedback and, like a good XR creator, went back to the drawing board to redesign the course for the next year.


PART 3

Now In Person! Sort of…

Like XR itself, a lot changed about "Augmented Reality for Creatives and Coders" between 2021 and 2022. For starters, the class would go from being fully remote to fully in person at the University of Maryland. I was also invited by Cal State Northridge's new Emerging Media Production program to teach the course to some of their fourth-year students, most of whom had begun as film majors. This class would be a hybrid model, with approximately a third of the classes in person and the rest over Zoom.


My co-teacher at UMD, Daniel Brown, was not available, so I prepared to teach the class solo, spending months strengthening my technical skills and securing a fantastic TA, Computer Science Ph.D. candidate Christopher Maxey, to assist the students with coding and Unity questions.


With the classes now in person, I could better focus the course around site-specific and real-time AR experiences. With each project idea that the students put forward, I would ask them how it created a compelling engagement with something in the real world, be it a person, object, or environment. How would their experience encourage the user to see, hear, feel, and experience the real world around them in new ways? What kind of constraints or particular design and technical considerations would this require? In short, why did their project need AR?

Concept board by Isabelle Klimanov

Additionally, I encouraged the students to generate experiences that might involve more than just an AR app on a mobile device. They could create a dance performance that used AR as a design element, or a guided audio experience with elements that used AR. The point of this distinction was to frame AR as a design feature that could add value to an experience across any number of mediums and use cases.


Concept board by Ian Vinkler

As we discovered from the projects made during the first year of the class, creating rich experiences that rely solely on AR content and interactions is currently difficult and time-consuming. I wanted to challenge the students to think more creatively, holistically, and strategically about the use of AR. (It's also an acknowledgement that few users enjoy an AR experience that requires them to hold up their phones for extended periods of time.)


This second time around, I spent much more class time with the students at the beginning of the semester making sure that they had a basic understanding of Unity and how to build AR applications with it. By verifying that each student could build a basic AR scene in Unity on their computer and deploy it to a mobile device, I ensured that everyone left with a working knowledge of the primary technical workflow.
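To give a sense of that baseline, here is a minimal sketch, assuming Unity's AR Foundation package, of the kind of tap-to-place scene the students built and deployed to their phones; the class and prefab names are illustrative rather than taken from actual course materials.

```csharp
// Minimal AR "hello world": tap the screen to place a prefab on a detected plane.
// Assumes an ARRaycastManager (AR Foundation) on the same GameObject as this script.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

[RequireComponent(typeof(ARRaycastManager))]
public class TapToPlace : MonoBehaviour
{
    [SerializeField] GameObject placedPrefab;   // object to drop into the world (assigned in the editor)
    ARRaycastManager raycastManager;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Awake()
    {
        raycastManager = GetComponent<ARRaycastManager>();
    }

    void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast from the touch point against planes detected by the AR session.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            // The first hit is the closest; place the prefab at that pose.
            Pose hitPose = hits[0].pose;
            Instantiate(placedPrefab, hitPose.position, hitPose.rotation);
        }
    }
}
```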


That said, the students were still expected to spend time outside of class researching different technical solutions based on the needs of their specific project.


As with the previous year, a few students faced frustrating and time-consuming issues with getting Unity to work properly on their computer and then getting app builds to work on their mobile devices. By and large, the struggle provided a useful learning opportunity in problem-solving, but for some, it generated weeks of roadblocks that might have been avoided if the program were able to supply the students with standardized hardware or if the development tools we had chosen were less finicky.


Partnering with Niantic

In the fall of 2021, while attending the Augmented World Expo, I connected with a couple of members of Niantic's developer relations team. They were about to release Lightship, an AR development platform that worked with Unity to create sophisticated AR experiences. It took much of the underlying multiplayer and contextual-awareness software that had made games like Pokémon GO such huge successes and handed it over to outside developers to create their own experiences. I scheduled workshops with both the UMD and CSUN students so they could learn Lightship and have the option of using it to create their final projects.


It was incredibly exciting for both the students and myself to work with Niantic's developer relations team on a new, and potentially game-changing toolset for creating AR-enabled experiences.


It embodied so much of what I wanted for the class: introducing the students to companies and professionals working in XR, testing cutting-edge technology, and offering them specialized training that they could not get elsewhere.


Of course, it also came with the challenges of working with a brand-new software solution: outdated, incomplete, or incorrect documentation, as well as updates to the SDK that would break existing projects or change established workflows. And unlike more established AR solutions, there was not yet a large community of developers to go to with questions or best-practice tips. (In the year since Niantic worked with us, their team has greatly improved Lightship’s documentation and tutorials, and they have cultivated an engaged and growing developer/creator community.)


I think our work with Niantic was a great way for the students and the IMD department to build a relationship with a leader in the AR experience space and a potential employer for our students. We were also able to give quick and useful feedback on the documentation and tools, which the Niantic team appreciated as they worked to improve the Lightship platform. As neither Christopher nor I had used the platform, we built a Lightship prototype alongside our students. We were so taken with the platform that we built the beginnings of a multiplayer AR block-building game; it won best use of multiplayer in Lightship’s summer 2022 competition.


Paper Prototyping

Another new approach that I took with the students in year two was to spend a day on paper prototyping. I was curious to see how thinking in a tactile way about the experiences they were creating would change their perceptions and approaches. Sometimes working with materials outside of software allows us to prototype faster or to think about the content and the narrative in new ways, and it yields a lot of quick and useful information about the UX and what’s compelling about it. Additionally, paper prototyping would be a much faster method (for most students) of translating their ideas into spatial content. It would help the students zero in on the affordances and constraints of how they intended to use AR before they spent time building, and it would also point out parts of their proposed experience that might be better realized through an approach other than AR.


Paper prototype of a demo AR experience created by Christopher Maxey and myself.

The challenge that I wanted to address was twofold. First, I wanted my students to think more specifically about the user journeys and narratives they were creating. I had observed that they would come up with perfectly interesting ideas but then kind of stop there. Making design documents, like user journeys, had helped, but they needed to "get on their feet," as we say in the theater. The second challenge I hoped paper prototyping would address was the time-consuming nature of building AR experiences inside of a game engine. Even small prototypes built in Unity can take a lot of time and expertise to create. Before the students invested that time, I wanted them to have gained as much information as possible, as quickly as possible, so that they were well prepared to build a more resource-intensive prototype.

Paper prototype of a demo AR experience created by Christopher Maxey and myself.

In general, I think it's an incredibly useful exercise, but it has its limitations. On the plus side, it puts the narrative and UX aspects of the experience front and center. The students start to think more specifically and spatially about what their experience does and how the user will engage with it. On the downside, the students often create 2D cutouts or cards to stand in for 3D objects, they use text to communicate in ways that (probably) are not optimized for AR, and they end up focusing too much on what will happen in AR. Ironically, working with physical materials to stand in for digital ones de-emphasizes their thinking about how their experiences will engage with the physical world--people, objects, environments.


In order to get some additional information from the students about what they gained from the exercise, I had them fill out a two-question survey at the end of class:

1) What worked

2) What didn't.


The results were interestingly mixed. For the CSUN students, the paper prototypes were revelatory. Paper prototyping catapulted their projects from fuzzy ideas to fully fleshed-out experiences. They created experiential road maps and discovered exciting ways to communicate the core theme of their content. It brought groups together around a clear and specific vision for the project.


Many of the UMD teams, especially those that had already done a decent amount of planning and design for their experience or that included students with a high degree of programming experience, found the exercise less helpful. Some students felt that it was a waste of time they could have spent building a digital prototype. Still others discovered gaps or holes in the user experience that they needed to flesh out.


A/B Testing

Paper prototyping was one of several areas where my experience with the two groups of students differed in interesting ways. As a general rule, I found that teaching in person proved to be far superior for the students' understanding and retention of information, but even more importantly, the students formed much stronger teams when they came together to collaborate in person.


However, teaching technical tutorials was far more difficult from the front of the classroom than from my office workstation. Live tutorials are always a bit of a crapshoot--there's nothing more embarrassing than walking students through a lesson only to find that there's some bizarre bug that you're sure wasn't there when you prepped it at home, or that a software update has derailed your well-laid plans.


For my CSUN students, who spent the first part of the semester fully remote and the last part mostly in person, there was a dramatic difference in their engagement and comprehension of the material between the two modes. Remotely, it was too easy for them to get through a class without getting something to work correctly and without asking for help. In person, they could more easily ask me or the person next to them for assistance, and I could spot students who were stuck but reluctant to ask for help.


My UMD students came from a number of different areas of the school: some from art and computer science, but others from architecture, psychology, and a grad program in user experience. Bringing them together into effective teams meant balancing skills and experiences carefully, so I picked the teams for each of the projects. What I discovered from their end-of-semester evaluations was that some students were upset by what felt like an arbitrary assignment of teams. Because I did not give them a sense of the thinking or process I went through to assign the teams, they assumed no thought had gone into it at all and that they had just been thrown together. In reality, I had spent a lot of time considering how best to match students together.


The CSUN students, by contrast, were all from the same major, and most of them were familiar with each other--though for many it would be the first time they were physically in a classroom together. This meant they began with a communication shorthand and shared experience that ended up producing teams that worked very closely together. I also think their experience as film students made the concept of creating an experience for an audience a bit more familiar to them than it was to the students at UMD.


Process Becomes Product

Both sets of students created largely successful final projects. As with the first year of the class, the students put together a presentation outlining both what they created and a bit about the process they used to arrive at their demo. After the presentations, the guests were able to playtest the final projects.


In general, the UMD projects were more technically sophisticated, but the CSUN projects were sometimes the more interesting applications of the technology. Here's a sampling of some of the projects that the students presented:


“EchoStAR,” an interactive projection installation that aims to encourage social interaction and connections between strangers. It uses strategically placed microphones in the room to capture the sounds of the audience as well as motion sensors to capture their movement. It translates these inputs into patterns of abstract color and form. As the audience moves through the space or changes the sounds they make, the installation dynamically responds.


EchoStAR by Delia Parrish, Nikita Rajarajan, and Sinaan Younus

“NavigateAR” is an indoor/outdoor campus navigation application that allows users to traverse and discover the University of Maryland in a more efficient and immersive way while also introducing them to campus resources and general points of interest. The AR experience provides highlighted directions that augment their path to their desired destination along with narrated and text directions. It will also encourage exploration of buildings like the Adele H. Stamp Student Union through an AR scavenger hunt led by the UMD mascot, Testudo.

NavigateAR by Jason Fotso, Isabelle Kliminov, and Samuel Deacon
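As an illustration of one building block behind an experience like NavigateAR, here is a hypothetical sketch (not the students' actual code) of how a Unity script might highlight a walking route in AR by drawing a line through a set of world-space waypoints; the class name, fields, and waypoint source are assumptions for illustration only.

```csharp
// Hypothetical sketch: render an AR route as a glowing line through waypoints.
// Assumes a LineRenderer configured in the editor; waypoints would come from
// whatever navigation source the app uses (e.g., GPS fixes or indoor anchors).
using UnityEngine;

public class RoutePathRenderer : MonoBehaviour
{
    [SerializeField] LineRenderer line;          // assigned and styled in the editor
    [SerializeField] float heightOffset = 0.05f; // lift the line just above the detected floor

    // Call whenever a new route is computed.
    public void ShowRoute(Vector3[] waypoints)
    {
        line.positionCount = waypoints.Length;
        for (int i = 0; i < waypoints.Length; i++)
        {
            Vector3 p = waypoints[i];
            p.y += heightOffset;
            line.SetPosition(i, p);
        }
    }
}
```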

“ART!” is an AR scavenger hunt that turns the Cal State Northridge campus into an AR student art gallery. The experience uses LiDAR scans of the campus and GPS to place content both indoors and outdoors.


ART! by Dylan Brown, Andrea Adler, and Danil Kinziashev

PART 4

Looking Ahead

As with XR itself, it feels like very early days for teaching how to conceptualize and build AR experiences. To me, it seems clear that spatial technologies will have an outsized impact on many areas of our day-to-day lives, whether in entertainment, education, health care, manufacturing, travel, or beyond. But for the present, creators and developers are often working with software and hardware tools that are far from optimal for creating content quickly and distributing it to devices.


What is also clear to me as an educator is that the trailblazing, cross-disciplinary nature of creating XR experiences offers an excellent opportunity to teach students a suite of useful hard and soft skills.


Creating AR experiences with an eye to the user experience and narrative journey from the beginning of the process helps focus teams of artists and developers, first and foremost, on why they are building something.


Students are forced to wade into the unknown both conceptually as well as technically, and while this can lead to a lot of frustration and painfully slow progress, it offers a somewhat rare opportunity to make content that has never been created before.


Portfolio project presentations for faculty and industry guests

There are relatively few examples of AR content creation best practices or standardized workflows, and this too can mean that students (and instructors) are challenged to create processes that are most efficacious for their particular team and experience.


As important as the technical skills that the students gain in creating AR experiences are the so-called "soft skills" they learn. The process of group ideation, design, project management, prototyping, consensus building, articulation of product goals and process, and problem solving puts the students in situations that more closely resemble the workplaces they are likely to find themselves in than most academic classes do. These are skills that will serve them regardless of whether or not they work in technology, and they give them the perspective and knowledge to create their own content, either on their own or with a team.


For me, the class created a laboratory environment to learn more about the general affordances of AR. I've been able to observe how my Gen Z students approach thinking about the technology and what kinds of use cases they find compelling.


I've seen how AR in its current form (mostly deployed via handheld mobile devices) is often most effective as a supportive design element rather than as the sole focus of an experience. My students have shown me how AR can be a powerful tool for rethinking our relationship to the physical spaces around us: whether through an experience that allows a community to plant a digital garden in a disused parking lot to collectively imagine its potential, or through a networked graffiti app that creates unexpected collaborations in unexpected locales.


It has also further confirmed my opinion that, as with all new technologies, we need to think about how storytelling and narrative can frame the technical possibilities of AR in a way that is relevant and meaningful for the widest possible audience. To me, this includes thinking beyond AR in an entertainment or art context and considering how storytelling and narrative apply to AR's use in contexts like health care, education, manufacturing, and social justice. These are areas where AR is making an impact now and where many people first interact with the technology in a sophisticated way. Creating engaging and compelling user experiences in these contexts is critical to anticipating the UX/UI needed for mass adoption of AR in future wearable devices.


Speaking of the future, I hope this article sparks conversations amongst educators and XR professionals about how to attract, support, and educate new designers and developers. Like many others using XR in the classroom, my understanding of the medium and the tools feels far from comprehensive. I’m always seeking out resources to refine and reorient my approach to teaching XR design. If you have ideas about XR education and training or thoughts about this article and the “Augmented Reality Design for Creatives and Coders” course, reach out to me on LinkedIn: https://www.linkedin.com/in/jonathan-david-martin-2012a746/




