CSR3 / Product Design

Simplifying the administration of players, rewards, penalties, and promotional campaigns for a car game, raising the SUS score from 42.5 to 95


Overview

Improving the creation and management of complex campaigns, streamlining player management, and optimizing how results are visualized

CSR3 was the upcoming sequel to the car game CSR2, which had over 50 million players worldwide. The client needed a management system for both the players and the prizes they would receive depending on many variables. Campaigns would also be generated to sell items, upgrades, and cars at promotional prices.

The system also had to handle more technical aspects of the video game, such as environment performance, shutting down or restarting environments, and querying system users' activity through an audit log.

To comply with my non-disclosure agreement, I have omitted and obfuscated confidential information in this case study.
All information in this case study is my own and does not necessarily reflect the views of CSR3 or any other related company.

The business goals

Speed up, reduce, optimize.

Although the system covered various aspects of the video game, the most important points for the client were:

  1. Minimize complexity when creating promotions: Many factors determined which promotion would be applied to which groups of players, such as location, time zone, player score, purchase history, item types, A/B testing, etc.
  2. Reduce the error rate when creating promotions or items or applying penalties to players: Since so much parameterization was involved, the system needed ways to prevent or minimize human error.
  3. Optimize response times: For game errors, server failures, and reports of player abuse.

The client knew exactly what they wanted; now it was only necessary to validate that the business goals were aligned with the needs of the users.

Users & audience

Administrators and players

In this case, we had two types of profiles: administrators, with different levels of permissions to create, edit, delete, and oversee content within the system; and players, to whom the decisions, rewards, gifts, promotions, and penalties generated in the system were applied.

  1. Administrators: Depending on their profiles and permissions, they could create, add, edit, and delete promotions, campaigns, and game items. Some administrators were exclusively dedicated to managing interactions between users, such as chats, teams, and abuse reports, while others only had access to the service management area to shut down environments and check system behavior and performance.
  2. Players: Even though they were not going to use the game management system, they would be the recipients of all the decisions made in it. That is why it was necessary to have the players' perspective on receiving rewards, offers, penalties, and new game items.

Team, role & responsibilities

1 Product Designer / 1 Junior UX Designer / 1 PO / 1 BA / 1 TL & 10+ Devs

It was a small group; my daily interaction was mostly with the BA, the PO, and the TL. In addition to the daily stand-ups and other agile ceremonies, I also met whenever necessary with the TL and the Devs to discuss solutions, generate technical documentation, and work on their implementation in the final system.

  1. My role: As a Product Designer, my role was to ensure that the system met the needs and expectations of both users and the business, contribute to a much broader analysis of issues related to processes, and provide a strategic vision of the business focused on users, competition, and long-term growth.
  2. My responsibilities:
    • Transform research insights into product improvements.
    • Translate broad, conceptual ideas and documentation into practical and valuable solutions for our users.
    • Drive the development of intuitive, user-centric designs for the system, ensuring a seamless and engaging user experience.
    • Clarify complex user requirements and conduct user research to inform design decisions.
    • Create both low-fidelity wireframes for concept exploration and high-fidelity designs for detailed visual representations, covering all stages of the design process.
    • Oversee project timelines and deliver high-quality design solutions within established deadlines.
    • Collaborate closely with a multidisciplinary team, including developers, business analysts, product owners, and team leaders, ensuring awareness of development feasibility and complexity when proposing design solutions based on developer input.
    • Create the Design System and integrate interaction design components into the overall design system, fostering consistency and scalability throughout the system.
    • Manage the handoff to developers with high-fidelity design assets, providing them with the resources needed for accurate implementation. Lead the end-to-end product design process.
  3. Team work methodology: The working methodology in this project was quite close-knit and direct. In addition to Scrum ceremonies, we utilized Jira and Confluence for documentation. We often held frequent, short meetings of 15 minutes or less where we clarified doubts or discussed ideas, sometimes with just one individual: the Product Owner (PO), Business Analyst (BA), Team Lead (TL), or a Developer.

    On other occasions, multiple individuals were involved, and if discussions required more time, additional meetings were scheduled for further deliberation. Some queries were resolved through Slack or emails.

The challenge

Simplify the parameterization and segmentation of actions dependent on multiple variables

The most challenging aspect of this project was finding ways to make processes involving numerous variables, such as creating promotions and applying highly complex rules to groups of players, more user-friendly and straightforward while meeting very specific conditions and criteria.

Factors to consider

50 million players

Although the game had not yet launched, it was expected to reach the same number of users as its predecessor, CSR2, which at the time surpassed 50 million players. This posed a significant challenge due to the diverse profiles and needs of the users.

Multiple time zones and regions

These 50 million players were spread around the world, which meant promotions had to be parameterized according to the location of different player groups and launched based on their time zones.

Player peculiarities

Each player manages a fairly extensive set of information, including, among others: global ranking, regional ranking, country ranking, RP points, races won, activity time, idle time, created teams, abuse reports, status, purchases, item inventory, cars, upgrades, money earned, money spent, gifts, chat behavior, and other game-related values.

Different currencies and local currency values relative to the US dollar

Currency differences and each country's inflationary situation had to be considered when creating promotions or items to be sold internationally, to avoid pricing the game out of markets with less favorable economic conditions.
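
To make the combinatorial nature of this parameterization concrete, here is a minimal sketch of the kind of segmentation data a single promotion had to carry. It is written in TypeScript purely for illustration; every field name is an assumption, not the actual CSR3 data model.

```ts
// Hypothetical promotion model -- all names are illustrative assumptions.
interface PromotionSegment {
  regions: string[];                 // ISO country codes the promotion targets
  timeZoneLaunch: boolean;           // launch at local time in each time zone
  minGlobalRank?: number;            // optional ranking thresholds
  maxGlobalRank?: number;
  purchaseHistory?: "none" | "occasional" | "frequent";
  abTestVariant?: "A" | "B";         // split-test bucket
}

interface Promotion {
  itemIds: string[];
  discountPercent: number;
  priceByCurrency: Record<string, number>; // local price per ISO 4217 code
  startsAtLocal: string;             // e.g. "2024-06-01T10:00", player-local time
  endsAtLocal: string;
  segment: PromotionSegment;
}

// Example: a discounted upgrade targeted at two regions, variant A of a split test.
const summerSale: Promotion = {
  itemIds: ["upgrade_nitro"],
  discountPercent: 30,
  priceByCurrency: { USD: 4.99, BRL: 14.9, INR: 199 },
  startsAtLocal: "2024-06-01T10:00",
  endsAtLocal: "2024-06-08T10:00",
  segment: {
    regions: ["BR", "IN"],
    timeZoneLaunch: true,
    purchaseHistory: "occasional",
    abTestVariant: "A",
  },
};
```

Even this simplified shape shows why a form-based UI with no assistance quickly becomes error-prone: every field multiplies the number of valid (and invalid) combinations an administrator must reason about.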

Process

Step 1: Define user personas

The first step is always to understand our users, classify them into groups, and define the needs and pain points of each group. To achieve this, I initially obtained information from stakeholders regarding the types of users who would use the system and the most frequent complaints.

Only qualitative research methods were used because the existing system had no measurement tools; Google Analytics, Hotjar, and the like had never been installed.

Step 2: Send the survey to the users

In this project, the users were very busy individuals with tight schedules, so I initially decided to send them a survey to gauge their needs and complaints. The good thing about surveys is that they can be answered whenever the user can spare 5-10 minutes, without hindering their workflow.

In this case, I decided to use the System Usability Scale (SUS), a 10-item questionnaire designed to collect users' subjective perceptions of various aspects of usability, such as ease of use, learnability, efficiency, and satisfaction. The questionnaire uses a 5-point Likert scale, where users rate their level of agreement with each statement. The responses are then aggregated and converted into a single usability score (see SUS calculator) ranging from 0 to 100; a score below 68 indicates below-average usability.
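
SUS scoring follows a fixed formula: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5. A minimal TypeScript sketch (the `susScore` helper is illustrative, not part of any project tooling):

```ts
// Convert one respondent's ten 1-5 Likert answers into a 0-100 SUS score.
function susScore(responses: number[]): number {
  if (responses.length !== 10) throw new Error("SUS expects exactly 10 responses");
  const sum = responses.reduce((acc, r, i) => {
    const item = i + 1;
    // Odd items are positively worded, even items negatively worded.
    return acc + (item % 2 === 1 ? r - 1 : 5 - r);
  }, 0);
  return sum * 2.5;
}

// Example: a fairly positive respondent scores 90.
console.log(susScore([5, 1, 4, 2, 5, 1, 4, 1, 5, 2])); // 90
```

The final score for a study is the mean of all respondents' individual scores.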

Step 3: User interviews & contextual research

At the same time as I sent out the survey, I worked with the stakeholders to define groups of users I could interview and scheduled the meetings. I created the form with the questions I would ask each user group during the interviews and defined the areas I wanted them to show me on their computer, in order to evaluate the usability of the flows they used most frequently.

The most important thing at this stage was to understand the key tasks for each user group and the most serious and common problems they faced when using the system.

Step 4: Analysis & utilization of the results

Once the interviews were conducted and the survey responses were collected, the next step was to convert all that information into graphs that would give us a clearer visualization of the most frequent and severe problems within the system. To do this, I used tools such as customer journey maps, red route analysis, a high-level summary report, and a detailed findings report.

When sharing the information with the team, ideas arose on how to approach each case, potential problems or challenges that would need to be considered when implementing the solutions, and technical observations regarding the ideas proposed by each team member.

All proposed solutions and exposed problems were prioritized on a map to determine which ones we should tackle for the MVP, which ones would be addressed next in the following stage, and which ones would be left for subsequent phases.

Step 5: Task Flows & Lo-Fi prototype

With all the collected information and the team's observations and ideas, I designed the low-fidelity flows and screens. Before drawing the first line of a wireframe, I always work on task flows: I want to understand the steps the user needs to take in each task before defining the screens, optimizing the interaction with the system, and deciding which components best fit each case.

As I progressed with the wireframes for each flow, I validated them with the Product Owner (PO), Business Analyst (BA), Team Lead (TL), and Developers in a low-fidelity clickable prototype to ensure that everything proposed was feasible and within the estimated timelines.

Step 6: User testing & iterations

Once the prototype was approved by the stakeholders, it was time to conduct a user test to validate that the flows met their expectations and addressed their pain points, in addition to gathering feedback on adjustments or improvements that they might come up with while testing the low-fidelity prototype.

Once the user tests were conducted, the feedback collected, and everything validated again with the stakeholders, I made the necessary adjustments along with some of the improvements users had suggested during the interviews.

Step 7: UI design proposals + A/B Testing

With all the flows reviewed and approved, I proceeded to design the interface. The idea was to use a sober, minimalist layout where the user's attention was directed toward the most important data and tasks.

Given the amount of information and functionalities involved, it was crucial that the design did not become a distracting factor within the system. The UI design needed to facilitate the reading and prioritization of elements within the screens.

I designed two proposals and reviewed them with the stakeholders to gather feedback. Typically, I create two design proposals and apply them to the homepage and some internal pages to conduct an A/B test with the stakeholders (and if time allows, with some users) to decide on one of the two proposals.

Step 8: Design system creation, high-fidelity handoff to developers & implementation support

Once the final design was approved, I created the design system following Material Design guidelines. In this case, I had to design two sets of layouts, since both a light and a dark version of the system were to be developed. After creating the design system, I supported the developers in building the Storybook used to validate that the React components matched the designs I had provided. Once implemented, I also verified that the design had been applied correctly across the system's screens.
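
As an illustration of that validation step, a Storybook story in Component Story Format can render a component against both themes side by side. The sketch below is hypothetical: the `Button` component, its props, and the story setup are assumptions, not the project's actual code.

```ts
// Button.stories.ts -- hypothetical story validating light and dark variants.
import type { Meta, StoryObj } from "@storybook/react";
import { Button } from "./Button";

const meta: Meta<typeof Button> = {
  title: "Design System/Button",
  component: Button,
};
export default meta;

type Story = StoryObj<typeof Button>;

// One story per theme, mirroring the two layout sets in the design system.
export const PrimaryLight: Story = {
  args: { variant: "primary", children: "Create campaign" },
  parameters: { backgrounds: { default: "light" } },
};

export const PrimaryDark: Story = {
  args: { variant: "primary", children: "Create campaign" },
  parameters: { backgrounds: { default: "dark" } },
};
```

Comparing each rendered story against the corresponding high-fidelity design made deviations easy to spot before they reached the system's screens.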

Step 9: New survey and user testing

With the testing environment operational in the most critical areas, namely campaign creation (awards and promotions) and user incident management (penalties and punishments), I conducted another usability analysis with users and sent them the SUS survey again to verify any improvement. The rating had increased from 42.5 (well below average) to 95. Despite being a partial validation, this was a highly positive result that demonstrated the effectiveness of the changes and a significant increase in user satisfaction.

Outcomes & results

Simplified workflows of the most important tasks for users and the business.

One of the most frequent pain points from the interviews was the number of steps and configurations necessary to create campaigns and deliver rewards or assign penalties to players. There were too many steps and too much information with little or no system assistance to facilitate these processes, leading to cognitive overload for the user. This primarily resulted in having to invest a lot of time and effort in creating campaigns, in addition to being a problem when incorporating new individuals who needed to learn how to use the system.

On the business side, being able to facilitate these processes was aligned with the overall and specific objectives of the business, related to performance, effectiveness, and productivity.

All this simplification and improvement of processes resulted in the SUS score rising from 42.5, well below average, to 95, an excellent score.

Better utilization and visualization of data

The previous system contained a lot of valuable information, but to view it administrators had to generate PDF reports. One of the initial recommendations was to surface all that data directly in the interface, so it could be used without generating a PDF.

Much of this valuable information was not reaching administrators in a timely manner. One of the challenges was to take all that data and leverage it so that administrators could make more accurate and timely decisions.

Reduction of human error

To minimize human error, processes were designed so that administrators could perform tasks while seeing all the relevant variables and dependencies. Steps and the amount of information shown to administrators were simplified as much as possible. Per-user action logs, improved error handling, and integration with the support system's help tickets, comments, and notes were implemented. Furthermore, an efficient help system was established so that administrators always knew what they were doing.

Improved alert system and incident management within the system

For the business, it was important to know the status of its servers, have test servers, and be able to monitor them all. For this purpose, sections were created where administrators could check the status of each environment: countries, cities, number of players, number of teams, hours played, and connection speed. They could also run tests to verify the health of the environments, and alarm systems allowed action to be taken as quickly as possible.

What would I have liked to do differently?

It would have been very helpful to have quantitative data to support decisions. It also seems unnecessary to me to have done practically all the system flows in low fidelity (196 wireframes); we could have moved much faster by wireframing only the main flows and designing basic screens or simple flows directly in high fidelity for the final prototype.