Immersing into the Augmented Reality World
Designing a mobile AR experience for a student library:
a UX case study
Plan into Action
Introduction
It started with the idea of creating a student library where we could place virtual models of students' projects in real-world settings that users can interact with, since everyone is familiar with handheld devices. This would be a great opportunity for students to showcase their projects to other students, faculty members, clients, and design recruiters.
I worked on the entire UX process and later prototyped the features of the app. I also created the visual guidelines for the design language of the app. Finally, I ran usability tests with different users and improved the design based on their feedback.
Background Study
IDoC is an augmented reality student library app where users can browse other students' work, view their projects in real-world settings, interact with the models, and see them at full scale.
Some features of the app include:
- Augmented reality models with marker tracking
- Stimulation of conceptual thinking and perception enhancement
- Educational and visualization purposes
Design Process
Challenge
- Communicating one's ideas effectively is a fundamental need for every designer.
- Being able to visualize and show that vision to others is every designer's dream.
- To facilitate this need, we thought of developing an intuitive, immersive AR-based application that would allow users to interact with and better understand students' work through digital interactive models.
Mental Model
Mental models are one of the most important concepts in human-computer interaction (HCI). Designers face various challenges when creating AR apps, one of which is the construction of new mental models.
We construct models based on our related, past experiences: the books we’ve read, the movies we’ve watched, the conversations we’ve had. We build mental models of software in much the same way. Google helps shape the mental model for search. Likewise, Amazon does for e-commerce; eBay does for auctions; Twitter does for microblogging; and Microsoft Excel does for spreadsheets. But for AR, the models are yet to be determined.
Likewise, we started building a mental model for our library app: guiding the user to scan a marker to view the model in real time, intuitive buttons for physical interaction with the model, and a full-scale view to compare the model's actual size with real-world elements.
Importance of Augmented Reality for a Library
- Easy access to information
- Allows people to open up to new experiences
- AR can give users a better perception of objects, as they can visualize the models more interactively
- It enhances convenience, as users can see through a model and explore its internal and external features simply by walking around it
- It improves the user's comfort, from finding information to uploading their work
Phase 1 : Discover
The IDC library must be opened while connected to the IITB VPN; only then can the user access the library files. A single shared user ID and password is circulated among the users. Much of the data is unorganized, and one may find it difficult to access the information needed. To view the content, the user has to download every file to their own directory, which is a hassle. Users can only see the slides or presentations that are uploaded on the server.
This overall experience can be improved by providing an application that serves the purpose and can show interactive models of the projects in real time.
Guidelines for AR app
Some guidelines from Google's ARCore design documentation:
- Sound and haptics for a realistic experience
- Design to promote movement
- Minimize text and controls in the camera feed
- Real-world physics and 3D behavior
- Simple, single-hand gestures
- Be generous with coaching, instructions, and hints
User Research
To gain qualitative data about users' goals, needs, frustrations, and motivations, we conducted short interviews with users who fit the demographics of our research.
The user set consisted of a batch of 14 individuals: 8 males and 6 females between the ages of 19 and 45. All candidates were chosen from our department among those who access the library for their needs.
Phase 2 : Define
For better interaction and understanding of target users, the user groups were categorized on the type and purpose of interacting with the application.
User Archetypes
User Journey Map
Empathy Mapping
Information Architecture
Here is the task flow for how the user personas will interact with the application.
Student
Recruiter
Task Flow
Initial Brainstorming
Putting pencil before pixels
Phase 3 : Ideate
We did a lot of brainstorming through sketching before we got into the wireframing stage, following the Crazy 8s sketching technique for various ideations of the AR screens. We also kept the notes from the user journeys and empathy maps at hand, which really helped in mapping out some additional features in the wireframes. We then compared and discussed the ideas and selected the best among them to put forth into the screens.
Storyboarding
Phase 4 : Prototype
At the prototyping stage, I started by developing mid-fidelity mockups by putting together elements from the ideation wireframes and defining the dimensions of elements in the UI.
Mid-Fidelity Wireframes
Registration Successful
Testing the UI legibility
User Interface Legibility for outdoor environment
I observed that several factors, including contrast polarity, affect the legibility of UI screens in AR. Studies of color-related factors suggest that color difference plays only a minor role in legibility under daylight ambient conditions, while negative contrast polarity degrades daylight readability.
UI legibility plays a crucial role in the development of the user interface, since users will be using the app under varying lighting. This is why the UI elements focus on contrast: the higher the contrast, the more visible the elements are in varying light. Earthy, pastel, and desaturated colors are to be avoided. The color palette focuses on bold, saturated, complementary colors to maximize contrast.
- Decreased the opacity of buttons so that opaque buttons do not obstruct the view of the environment
- Added an overlay layer in front of the camera feed so that the AR objects remain clearly visible under any environmental lighting
- The opacity of the overlay depends on the ambient brightness of the environment
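The brightness-to-opacity relationship described above can be sketched as a clamped linear mapping from ambient light to overlay opacity. The thresholds, opacity range, and function name below are illustrative assumptions, not values taken from the actual app:

```python
def overlay_opacity(lux, min_opacity=0.1, max_opacity=0.6,
                    dark_lux=50.0, bright_lux=10000.0):
    """Map ambient brightness (in lux) to a darkening-overlay opacity.

    Brighter environments get a more opaque overlay so AR elements
    stay legible; dim environments need almost none. All thresholds
    here are illustrative, not tuned values from the app.
    """
    if lux <= dark_lux:
        return min_opacity
    if lux >= bright_lux:
        return max_opacity
    # Linear interpolation between the dark and bright thresholds
    t = (lux - dark_lux) / (bright_lux - dark_lux)
    return min_opacity + t * (max_opacity - min_opacity)
```

In practice the lux value would come from the device's ambient light sensor, and the two thresholds would be tuned per device during testing.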
Color Accessibility
To achieve color accessibility in our app, we took the following design decisions:
- Having an overlay screen between the camera feed and the screen UI
- Adding transparency to the UI elements so that the camera feed can be seen easily through them
- Adding enough contrast between foreground and background UI elements
- Using a glassmorphism effect in AR elements so that the environment remains visible
- Keeping the camera feed screen to minimal interactive buttons
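One way to verify the "enough contrast between foreground and background" decision is to check color pairs against the WCAG 2.1 contrast-ratio formula. A minimal Python sketch (the function names are my own; WCAG AA requires a ratio of at least 4.5:1 for body text):

```python
def relative_luminance(r, g, b):
    """Relative luminance of an sRGB color (channels 0-255), per WCAG 2.1."""
    def channel(c):
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two RGB colors, from 1:1 up to 21:1."""
    l1 = relative_luminance(*fg)
    l2 = relative_luminance(*bg)
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)
```

For example, white text on a black overlay gives the maximum ratio of 21:1, which is why a darkening overlay behind the UI keeps text readable over an unpredictable camera feed.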
Readability of fonts in AR
No conclusive research proves that sans-serif fonts are more legible than serif fonts, or vice versa. However, recent studies have suggested that many sans-serif fonts are legible but not readable, and much of that has to do with their letter shapes.
I chose a sans-serif font for increased readability, to enhance the overall user experience within the app.
Iconography
Home, Scan, Profile, Menu, Edit, 3D Model, Save, Back, Back from AR Feed, Close from AR Feed, Theme, Add Project, Exploded View, Interaction with Model, Full Scale View, Project Info
Final Screens
Launch and Login Screen
The user can log in via face detection or the fingerprint scanner when the prompt appears.
Registration Screens
The user can register using a personal email ID or college ID, or using a mobile number, which is verified via a verification code.
Account registered and log in
The user can log in after successfully registering in the app.
Home Screen
List of Students of a particular batch
Student's Projects
Project Details
Marker Scanning Interface
The user opens the scanner and scans the marker, which positions the 3D model in AR.
Exploded View
Interaction with Model
Full Scale Model
Project Info
User Profile Screens
Users can view their own projects and bookmarked projects, and can update their general information. Users can also upload their projects and 3D models.
App Development
IDoC application development in Unity
and extended marker tracking using ARCore
Hands-ON Experience
Phase 5 : Test
At the IMXD exhibition in IDC, we tested the app with various user groups. We prepared a brochure containing various student projects, each with a marker for its 3D model. We also set up a rig for experiencing the full-scale model of a project so that users could walk through it.
Usability Acceptance Testing
App launch
Exploded view
Full scale model
Observations
- Users were able to scan the marker in place easily
- They could easily return the model to its original position thanks to the extended marker tracking
- The UI elements were intuitive, and users could easily switch between different view modes of the models
- Some found it difficult to understand the functions of buttons, as no title text was shown
- They really wanted to spend more time engaging with the model
- Users were able to reach every feature and detail of the model by simply moving toward it
- The model's position got displaced when another object entered the camera feed
- Users were able to read all the data displayed on the screen easily, even under bright lighting conditions
Improvements
There are many areas to improve in the visual, usability, and accessibility aspects of the app:
- Providing a set of instructions or a guide before use can avoid confusion among users
- Showing a pointer facing toward the model if the user loses it in the camera feed
- Introducing gestures instead of buttons, which would allow us to remove the buttons from the camera feed
- Introducing model interactions where the user can sit in one position and use hand gestures to zoom and rotate the model
- Discouraging users from moving backward, since they pay attention to their phone screens and ignore the real world, which is unsafe
- Giving instant feedback to help the user know that a particular process is happening
- Giving depth and shadows to models so that they look real in AR
- Having preset animations for model interaction, making it easier for a user to view the features from a single viewpoint
- Introducing sound effects for every interaction to make it more engaging
Conclusion
Users enjoyed the overall experience of interacting with the models. They found the interface simple and easy to use even in the absence of instructions. They were easily able to understand the models and perceive them in the real environment. Solving problems in the app took a lot of research work and brainstorming.
It was a fun exercise. We learned a lot in the design process:
- Working together in a team and arriving at design decisions
- Brainstorming how the app would work, the process, and the steps
- Looking up guidelines, as we were working on AR for the first time
- Learning about the user and their experience using AR
- Achieving legibility of UI elements as the background keeps changing
- Iterating on the design after the test results
Akhil
Angshuman
Amit
Screenshot taken from the IDoC app
I was responsible for the entire UI/UX process of the app, from the discovery phase to prototyping and usability testing.
Akhil and Angshuman were working on Unity and the coding part of the app.
And Amit was working on 3D models and graphic design for the IDoC brochure.
Thank You
Let's get in touch if you liked the process and the overall development of the app,
and let's create more beautiful experiences for users together.