Gear Up is a national program that aims to get more high schoolers interested in, applying to, and enrolling in college. In the Idaho chapter, a large portion of time, money, and effort is spent on giving middle school and high school students tours of in-state college campuses. The organization has found that these tours are the best way to garner excitement about going to college. Gear Up Idaho teamed up with the GIMM Works organization (employed by Boise State) to develop an app that could assist in these campus tours.
The main reason this project was contracted was to solve problems Gear Up Idaho was having on its tours:
Tour guides varied in quality, so the tour experience was inconsistent.
Many buildings could not be explored because there were classes in session.
The admissions office was burdened with a high number of students/requests for tours.
Tours were not engaging.
Beyond fixing these pain points, the overarching goal was to get students to picture themselves at the college they are touring.
Users & Audience
Admissions office, responsible for setting tour guidelines.
Gear Up coordinators, responsible for leading students through the tour and app.
Middle school and high school students, experiencing the tour/app.
Parents, possibly experiencing the tour/app on a weekend solo trip.
Roles & Responsibilities
Anthony Ellertson: Program Supervisor
Olivia Thomas: Project Manager, secondary back-end programmer
Jon Kido: Back-end programming lead
Kayla Wilson: UX researcher & UI design lead
Gabe Solis: Front-end lead, secondary designer
Jessna Rodriguez: Front-end programmer
Ryna Hall: Front-end programmer
Scope & Constraints
The timeline, while substantial in length (16 weeks), was limited by the dev team members’ ability to commit only 10 hours/week to this specific project. All members were balancing other projects within the GIMM Works organization.
The university’s COVID restrictions greatly limited the team's ability to go to campus. This was significant for us because we were working on an app centered around beacon technology.
COVID also limited the schedules of many of our partners (admissions, advisors, students). This made it difficult to connect for feedback without weeks passing between iterations.
Student Surveys Takeaway 1
While the surveys were informative, we struggled to get many students to participate in them, so the data should be regarded with careful consideration. Our team decided that while we might use the data to decide which features to add, we couldn't let it guide us on which features to remove.
When asked why a student would be interested in college, the response, “To find a community and more people like me,” was almost as common as, “To learn.” Paired with the fact that many of these students are from towns with fewer than 10,000 people, we were reminded that for many, college was about so much more than school. To reflect our newfound understanding of these students in the app, we added the idea of ‘highlights’ to our brainstormed feature list. When the student came upon a point of interest or POI (an academic building), there could be ‘highlights’ nested within that POI to showcase extracurriculars, like the clubs/organizations that also used the space.
We hoped this could shine a light on the fact that college was more than just academics.
Student Surveys Takeaway 2
Prior to these surveys, many discussions around the app were held in a purely positive light. In other words, how do we expose the students to as much positivity about the college experience as possible? After seeing 100% of respondents state that money is a reason they'd avoid college, we acknowledged there were some negatives of college that the app needed to be transparent about. Saying extremely positive things about the college experience only goes so far if the students retain all of their original concerns.
We looked at how we could address their concerns and decided the best way would be to provide links and access to resources that already address those issues. Within the school itself, there are myriad resources for understanding financial services. Those resources all made it into the app as a result of this survey.
Potential Features List
In hindsight, I'm still not sure if we built the feature list too soon. In the following research session, you will see that an interview with the advisors greatly altered this feature list. I'm curious if we should have waited until all research was done to make a feature list. Did making the list anchor our beliefs towards certain features? Or was writing out the ideas in our head, helpful for clearing our minds? In the end, we still uncovered the desired features, so maybe it's okay that we made the list before finishing our generative research.
An in-game currency
An AR component - Plant your roots
Collectables such as the digital plants
An AR component - Make your mark
Collectables such as the paint splatter
Points of Interest
Start Tour individually
Join Group Tour
Lead group tour
Fill out the interest survey
Follow a path on the map
Overlay info, facts, and videos on a building
Configure notification and location settings
Receive notifications when near a new POI?
Gamified activities (quizzes, matching, etc.)
Events (for food)
Push notifications (food event)
Admissions can indicate availability
Tour ‘receipts’ or histories for monitoring
Videos of the insides of buildings
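Several of these candidate features (POIs, proximity notifications, food events) hinge on the beacon technology mentioned earlier. As a minimal sketch of how that feature could work, the snippet below maps beacon signal strength to POIs and fires a notification only the first time a user comes near one. The beacon IDs, POI names, and RSSI threshold are all illustrative assumptions, not the app's real data or implementation.

```python
# Hypothetical sketch of "notify when near a new POI" via beacons.
# All IDs, names, and thresholds here are invented for illustration.

NEAR_RSSI_THRESHOLD = -70  # signal strength (dBm) treated as "near"

POI_BY_BEACON = {
    "beacon-001": "Student Union",
    "beacon-002": "Engineering Building",
}

def pois_to_notify(beacon_readings, already_seen):
    """Return POIs the user is near for the first time this tour.

    beacon_readings: dict of beacon id -> RSSI in dBm
    already_seen: set of POI names already notified (mutated in place)
    """
    new_pois = []
    for beacon_id, rssi in beacon_readings.items():
        poi = POI_BY_BEACON.get(beacon_id)
        if poi and rssi >= NEAR_RSSI_THRESHOLD and poi not in already_seen:
            new_pois.append(poi)
            already_seen.add(poi)
    return new_pois
```

Tracking `already_seen` per tour session is what keeps the notifications from becoming spammy as students wander back and forth between buildings.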
After hearing from the coordinators and the students, and with some idea of what we wanted to implement, we wanted to hear from the advisors. These are the folks who lead the tours with the students every year, so we knew we wanted to hear whether they had any issues with the tours and how we could help solve them. While we were expecting a discussion of classroom management, these teachers spent most of the time talking about how to help the students and get them excited. They were so enthusiastic about the tours that it was actually difficult to get them to talk about pain points. This reminded us just how important the primary goal of engaging students was.
From the discussions held with the advisors, I was able to create a journey map that highlighted the biggest concerns that advisors had on tours.
The biggest thing that came out of our focus group and corresponding Journey Map was undoing an assumption we had made about classroom management. Since we were very feature-happy to start out, we saw our app as solving all problems related to tours. We had originally assumed that advisors would benefit from the app's AI, funneling the students into groups and controlling what they viewed and the experience they had. Speaking with advisors, we had to unlearn this assumption. The advisors wanted classroom management to be in their hands, not in the app's. They didn't like the application keeping track of how long they were at locations or telling them how far of a walk it was to the bus. They wanted each tour to be reactive to what the students were finding most interesting that day. They wanted to be able to follow the rabbit hole of student engagement to wherever it led.
For development, this did mean we would have to plan a little better for this freedom. The less management the app did, the harder it was to coordinate certain aspects of it. For example, if we don't know exactly how long a group will be in a specific place, how do we know how many facts to display on the screen while they are there? Regardless of the new design challenges this brought about, we were interested in and empathetic toward the stakeholders' needs, so we prioritized this new structure.
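One way to handle the "how many facts?" problem without constraining the advisors is to stop picking a fixed count up front and instead rotate through a POI's facts for however long the group lingers. The sketch below shows that idea under assumed names and timing; it is not the app's actual implementation.

```python
# Hypothetical dwell-time-agnostic fact display: instead of choosing
# N facts in advance, cycle round-robin through whatever facts the
# POI has, switching at a fixed interval. Names/values are assumptions.

def fact_for_elapsed_time(facts, seconds_elapsed, seconds_per_fact=20):
    """Pick which fact to show after a given time at the POI.

    Wraps back to the first fact once all have been shown, so the
    screen stays fresh no matter how long the group stays.
    """
    if not facts:
        return None
    index = (seconds_elapsed // seconds_per_fact) % len(facts)
    return facts[index]
```

With this approach a two-minute stop and a twenty-minute rabbit hole both "just work," which is exactly the reactive tour structure the advisors asked for.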
Comparative & Competitive Analysis
With our comparative & competitive analysis, we focused on four apps that dealt with topics similar to ours. Let’s Roam was gamified, GPSmyCity had points of interest, Geotourist had AR, and CampusTours.com was about college. At this point, we had a good idea of what all the stakeholders wanted, so we wanted to see how their wants compared to and competed with other applications.
This was an interesting experience because we had to know when to use the analysis as a form of comparison or as a form of competition. For example, seeing that every app had points of interest, was a way in which we matched up well and that encouraged us to go forward with it. In contrast, seeing that no one else had 'Food Locations' or 'Expert Finders' excited us because we saw an opportunity to stand out from what had already been done.
I believe the deciding factor here, between compare and compete, was the previous research we had done with stakeholders. We were anchored by what they had told us about their experiences with tours and it helped us understand how we wanted to be, relative to similar applications.
MVP Features List
At this point, we knew the most value would come from being intentional with our POIs/highlights, making account authentication secure, personalizing tours, and allowing freedom within the tour to explore those POIs. With this information, we concluded our MVP features list.
Fill out interest survey
Points of Interest
AR overlay info and videos on buildings
Follow a path on the map
Configure notification/location settings
Receive food notifications
With confirmation from the customer on the features being implemented, we focused our attention on how these features would appear in the UI and how the underlying logic would flow. This was when I created a logic diagram, and we went through a few phases of iterating the logic flow, presenting it to the customer, syncing internally with the team, adjusting the flow, and repeating until we settled on a final version of the logic.
The logic diagram became very messy and was translated into four prototype iterations that made very little sense. From those four messy prototypes, a final paper prototype was born. This paper prototype was mostly used by me to create the Figma mockup. The paper prototype was never put in front of stakeholders, which was a missed opportunity. I do believe we were lucky that they accepted our solution, but in the future, I wouldn't move to high-fidelity screens as quickly without checking in with stakeholders.
Having agreed internally on how we wanted to implement the features, we were worried about misaligning with stakeholders since they had yet to see the paper prototype and we were already developing screens. We found it important to turn the paper prototype into a low-fidelity wireframe to not only communicate our ideas but also to excite the customer about those ideas since it had been a couple of months at this point and no screens had been shown.
One unfortunate aspect, which I wish I had advocated against, is that we did not get these wires in front of any of the students or teachers, despite the Figma files being fully hooked up and interactive. While it was great to have some design decisions complimented by supervisors, there was a missed opportunity at this point to see what end-users thought.
Micro Design System
Once features were solidified and the logic of the screens was agreed on, it became time to stylize and design the screens. This was aided by a design system template I have made for other projects. My design system follows an atomic structure: the first row is the ‘atom’, the second row the ‘molecule’, the third row the ‘organ’, and the fourth row the ‘organism’. When implementing new styles for different applications, I start at the atomic level. I also try to begin design only once a large portion of the UX research has been done. This is an ideal situation that is hard to achieve in most projects, as business requirements can change and developers need screens ASAP. In this case, we were able to thoroughly investigate the users before designing.
With all of the current research and understanding of the problem, we constructed our MVP prototype. The first embedded Figma file is the MVP mobile app that was discussed throughout this page. The second file is the admin page that accompanies the app, which I also designed, but did not do documented research on.
Outcomes and Next Steps
With the final MVP prototype completed, development is underway and in the hands of some great developers. The current app is still in its previous version (BEAM) and can be found here. For future updates, in addition to the current overhaul being done, we listed some other potential features that might be valuable based on our user research and what we learned.
An AR component to promote engagement and growth over time. Specifically, the idea of planting digital seeds that you can return to the following year and watch grow when pointing your camera at the location where you planted them.
Expert Finder: this could connect students with professors in their field of interest, or even alumni from their high school, which aligns with Gear Up’s goal of helping students "picture themselves" as a student.
An in-game currency to promote gamification and engagement from students.
Tour ‘receipts’ or histories for monitoring. This was de-prioritized because the teachers didn't expect to need them. User testing after a few months could reveal this as a needed feature.
Improve accessibility if a student without access to sound or sight wanted to tour. Ex: have an audio narrated tour.
Events (for food): Since BSU students have access and the beacons are in place, this could be a way for touring students or BSU students to learn when mass catering events have food that would otherwise be thrown away.
Admissions involvement: this will be an ever-evolving process. As app updates are deployed and used, needs and features are expected to arise that aid the admissions office in doing its job.