Part 1: Gathering Information
Reflecting on Myself
As both a learner and a designer, I value clarity and intentional structure in learning games and simulations. When I engage with educational games, I care less about flashy mechanics and more about whether the experience helps me understand systems, relationships, or concepts more deeply. If the mechanics feel disconnected from the learning objective, I disengage quickly.
For entertainment games, I value immersion, meaningful progression, and well-paced difficulty curves. Frustration is acceptable if it feels fair, but confusion caused by unclear objectives or poor interface design is not.
From a usability standpoint, I prioritize:
- Clear navigation
- Consistent visual hierarchy
- Explicit feedback
- Low cognitive load
What is less important to me:
- Overly complex menus
- Excessive branching without guidance
- Aesthetic polish at the expense of clarity
Reflecting on this reminded me that even when learning content is strong, usability issues can significantly disrupt engagement.
Reflecting on Others – Data Collection
Since I am focusing on a User Persona, I prioritized gathering information about how undergraduates interact with digital learning environments and complex interfaces.
As the assignment required, I limited myself to three sources:
Source 1 – Research Study (Second-hand Data)
- Study on undergraduate user experience preferences in digital learning environments (peer-reviewed journal article).
- Focused on cognitive load, navigation clarity, and perceived usability.
Source 2 – Accessibility & UX Guidelines (Second-hand Data)
- WCAG-based accessibility research and usability design recommendations for digital learning tools.
- Emphasis on color contrast, text readability, and information chunking.
Source 3 – Public Player Motivation & UX Data (Second-hand Data)
- Quantic Foundry player motivation data.
- Used to understand expectations of digital-native users in interactive environments.
Where I Searched & Type of Information Found
All three sources were second-hand research.
I searched:
- Academic databases (MU Library access)
- Usability and accessibility documentation
- Player research databases
The types of data found included:
- Quantitative survey data (student preferences, usability ratings)
- Reported frustration points
- Design recommendations
- Motivational percentages and player-type breakdowns
The audience examined:
- Primarily undergraduate students (18–24)
- Digital-native learners
- Individuals accustomed to modern interface standards
Part 2: Analyzing Findings
Across the three sources, several patterns emerged:
1️⃣ Undergraduates expect intuitive navigation
Students show low tolerance for unclear instructions or hidden information. When cognitive effort is spent figuring out “what to do next,” engagement drops significantly.
2️⃣ Cognitive load must be managed carefully
Complex content (such as Dante’s Inferno) requires scaffolding. Chunked information, progressive disclosure, and visual signposting are important.
3️⃣ Accessibility is not optional
Color contrast, text legibility, and flexible pacing affect not only students with disabilities but overall usability for every user.
4️⃣ Engagement does not override clarity
While modern users enjoy interactivity, excessive choice or visual clutter increases frustration.
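To make the color-contrast point above concrete: WCAG defines contrast as a ratio of relative luminances, and its AA level requires at least 4.5:1 for normal-size text. The sketch below implements that published formula in Python (the function names are my own, not from any particular library):

```python
def srgb_to_linear(c):
    # WCAG 2.x sRGB linearization for a channel value c in [0, 1]
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    # rgb is an (r, g, b) tuple of 8-bit integers
    r, g, b = (srgb_to_linear(v / 255) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    # Ratio of the lighter luminance to the darker, each offset by 0.05
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on a white background yields the maximum possible ratio, 21:1;
# a design tool for a Dante-based game could flag any pairing below 4.5:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))
```

A quick check like this is cheap to run over a game's palette early in design, which fits the research finding that accessibility barriers degrade usability for all players, not only those with visual impairments.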
Alignment with My Own Reflection
Interestingly, much of the research aligned with my own reflection as a user. I also value clarity over complexity and become disengaged when navigation interferes with learning.
However, the research highlighted something I had not fully considered: even minor usability barriers can significantly increase cognitive load, especially when the learning content is conceptually demanding.
For a Dante-based serious game, where interpretation and symbolism are already cognitively taxing, usability will play a critical role in maintaining engagement.
Part 3: Reflection on the Data Collection Process
Data collection was smoother than expected, primarily because usability research is well-documented and accessible. However, one challenge was distinguishing between “learner” and “user” data, as there is natural overlap between the two.
Another challenge was narrowing the focus. Usability research is vast, and artificially limiting the process to three sources required prioritizing relevance over breadth.
If I were to repeat this process, I would consider incorporating:
- At least one first-hand interview with a current undergraduate student
- Direct observation of how students navigate complex digital interfaces
Overall, this process reinforced that a well-designed serious game must not only teach effectively but also guide users clearly through the experience.