Designing a credit risk review portal

The portal helps banks assess borrowers’ creditworthiness more effectively by providing a platform for detailed reviews and assessments. With advanced analytical tools, it generates comprehensive reports on credit risk, evaluates financial indicators, and analyses credit history.
Client
Credit Suisse
My Role
UX Designer
Duration
6 Months
Team
1 Project Manager and 2 UX Designers

Project Overview

There is both a business need and a regulatory requirement for a systematic, seamless way for the CS Risk organisation to interact with the Credit Risk Review (CRR) process.
Function of Credit Risk Review (CRR)
Provide an independent evaluation of the effectiveness of credit risk management processes and identify emerging risks impacting the quality of those processes.

Drivers for change

The current Credit Risk Review process is manual, time intensive, and primarily carried out through Excel files and email exchanges.
In most cases, information, reports, and updates about the process are not readily available to stakeholders.
Because the process is manual, a credit risk review takes anywhere between 3 and 5 weeks and involves a large number of transactions.

Success Criteria

Experience objectives:
  • Reduce the time it takes to reference historical data
  • Make auditing and reconciling Reviewable Entities easy
  • Reduce the time each person spends in the process
  • Give users meaningful insights.
Data integration:
  • Integrate data into their workflow
Permissions and Accessibility:
  • One place, with the right permissions and access
Project Goals:
  • Learn new insights about the review process
  • Define user needs that are critical and important
  • Use user research to frame and shape the workshop
  • Make users feel engaged in the review process
  • Make the review process easier

Research

Objective

Understand the Credit Risk Review process, the users and stakeholders involved across different locations, and the methods used to conduct the risk assessment.
Research planning
Remote interviews (5 users, 5 stakeholders)
Artefacts walkthrough
BRD Analysis
Workpaper heuristics study

Artefacts consumed

To achieve the research objective, the CRR process was understood through user interviews across different geographies, after which the general process and journey were mapped.
The artefacts walkthrough gave insight into
  • The processes and systems that are currently used by the review officer
  • Pain points in the current process and opportunity areas as an outcome
  • The information the new tool should contain, what to prioritise, and the scope for automation

Research Synthesis

Lifecycle of a review
We collaborated with various personas to clearly define the review lifecycle, detailing each key step and specifying the unique role of each persona at every stage. This approach ensured a well-understood and effective review process.

Journey Map

Credit Risk Review (CRR) is a global, highly skilled function of the client. Its main goal is to review 100% of the company’s credit risk exposure and report consolidated findings directly to the Board of Directors. This journey map gives a 1,000-foot overview of how the CRR function performs its reviews and the different factors involved in the process.

Brainstorming

During the brainstorming phase, we led the team to generate a wide range of ideas to address the design challenge. Using techniques like sketching and group discussions, we explored diverse solutions and identified the most promising concepts.

User Persona

With insights from our literature review, user research, and empathy mapping, we now had a better grasp of the problem. Creating user personas allowed us to simulate real-life users and propose solutions that address their specific needs.

Reviewer in Charge

Carrol August
Reviewer in Charge
BIO
The Reviewer in Charge (RiC) is primarily responsible for planning and setting up reviews, managing stakeholders, facilitating the review, and collating all findings that are part of the review process. They are held accountable for the overall output to the management team.

“ My job is to be able to step back and identify process gaps… Where is a best practice? And try to bring that feedback to make sure that things could be improved ”

GOALS
  • Facilitate a comprehensive process with complete accuracy.
  • Provide the risk committee with an unbiased view of risk management.
  • Identify potential improvements in the Risk Management processes.
  • Create actionable recommendations to help the company mitigate risk potential.
PAIN POINTS
  • Struggles to get the right data in the right place, which leads to frustration
  • Manually slices and dices the portfolio from source data to create the pre-review memo
  • Inconsistencies in data across the process because teams document in slightly different ways
NEEDS
  • Data in single view, in one place so I can slice and dice the portfolio on different attributes, regions, exposures.
  • Efficient way to gather inputs from reviewers and present the findings to management.
  • Validation of past recommendations
  • Data already processed by the 1st and 2nd lines of defence.

The Reviewer(s)

Tatum River
The Reviewer(s)
BIO
Reviewers are responsible for assessing assigned CPs by gathering artefacts and communicating with various teams for clarifications. They are held accountable for completing the work paper and analysis, then submitting it to the RiC for review.

“ There's a whole process now, everything done through Excel and SharePoint, no system to go to for the full complete workflow. We create a lot of folders on SharePoint, a lot of Excel files to write up the report.”

GOALS
  • Spend more time analysing the process and portfolio, and less time on questions of data gathering and clarifications.
  • Stop unnecessary documentation of the workflow if a tool can do that.
  • Be more effective and efficient in how we gather information to analyse.
  • Communicate collectively and effectively with other reviewers and teams.
PAIN POINTS
  • Dependency on multiple people and systems to get accurate files to review.
  • A lot of manual work to get the data we need, copy and paste and then interpret it.
  • Manual tracking and tracing of data, approvals, and comments throughout the work paper process
NEEDS
  • Transparency in terms of sourcing, aggregation, imports so that it can be interpreted.
  • Wants templates that automatically collate CP details from various sources into a single work paper file.
  • Data integrity in terms of accurate, complete information in the right place.

Initial Findings

  1. Finding the right data and information is difficult
  2. Long, manual forms and a lot of copy-pasting
  3. Configuration of the Workpaper is completely manual
  4. Manual consolidation of data is inefficient and prone to error

Problems to solve

The research surfaced the core problems with the current Credit Risk Review process:
Manual process
The current Credit Risk Review process is manual, time intensive, and primarily carried out through Excel files and email exchanges.
Distributed silos
50-60 associates distributed across the US, Zurich and APAC
Information, reports, and updates about the process are not readily available to stakeholders.
Oversight
The credit risk review process takes anywhere between 3 and 5 weeks.
The team audits another team that is responsible for reporting on Credit Suisse associates’ activity.

Opportunity areas

Review Planning
  • System driven collaboration, ability to assign tasks and follow up via system. Reduce email back and forth.
  • Built in, automated approval workflows
  • Digital review tracking, automated scheduling
  • Automate/Augment sample creation
Fieldwork
  • Modular, customisable work papers
  • User-friendly change logs and change history
  • Tool driven file management, move away from shared drive
Report Writing
  • Live document to consolidate findings
  • Inbuilt communication channel/feature to increase visibility and reduce tracking effort
  • Flexible and automated report generation with ability to change as per requirement
Assessment and Voting
  • Ability to capture minutes of meeting for audit trail
  • Enable a more audit friendly process
  • Ability to summarize the report visually
Publish and Close
  • Tool driven file management and audit trail

Experience Design Solution

UX Principles

Establish workflow insights
Maintaining an audit trail to generate insights that help users and the business make informed decisions.

Ability to trace artefacts, reports, analysis and recommendation across the review cycle.

Keeping track of what's going on in the review process and showing what has happened, to increase efficiency.
Manage sensitive information
Ensuring security of sensitive data by providing users with contextual access to information.

A secure and usable interface using authenticators and micro-copy that encourages security and accountability.

Verifying the owners and users.
Create flexible frameworks
Ability to update steps or change processes due to regulatory needs.

Allow users to start with a baseline review framework and add or skip elements as specific review requirements dictate.
Coordinate clear action for users
Give users clear, actionable instructions and active guidance on what to do.

Provide constant feedback and persistent navigation.

Establishing consistency across CRR process to build trust, efficient use of the tool and strengthen familiarity.

Key Screens

Dynamic review dashboard for the RiC

Depending on which stage the review is at, the RiC is presented with contextual information to help them manage and track their reviews effectively.

Power to the RiC to customise review checklist

The review checklist can be easily customised and applied across multiple work papers at the click of a button.

Automatic collation of review findings

Findings from multiple work papers are automatically collated for the RiC, so that they can focus on analysing and synthesising, not copy-pasting.

Quick analysis of ratings for drafting the review report

The RiC needs to complete the assessment criteria in order to document the review rationale and ratings that will be shared in the final report. Today, the RiC interprets disparate information shared by multiple reviewers.

Auto-fetch information for report writing

Review recommendations, portfolio information, the executive summary, and the report narrative are drafted in the tool and exported to a consistent template.

Process control

A timeline and summarised view of the artefacts and process, using the RiC checklist as a backbone.

Usability Testing

Prioritise
Identify the features and screens that were part of the MVP
Prototype
Create a high-fidelity click-through prototype in InVision
Test
Test the prototypes with users, capture feedback, and finalise the screens

Prioritise

We experimented with a lot of different ways to prioritise effectively with our stakeholders.

Prototype

Storyboards
We first created storyboards to help us translate ideas into low fidelity concepts.
Wireframes
Storyboards were then aligned upon and translated into wireframes, which were also used in tests and as stimuli to surface technical limitations and constraints.
Design Concepts
We aligned with the client’s design team on the design system and language to be used for the product, so that it works and feels familiar to current users.

Test

Identify what designs to test
Reviewing the Product Roadmap and the Journey Map helps us determine the next significant set of features to test and learn about.
Create a prototype
We pick a set of user actions to design at a high fidelity in an interactive prototype. The goal is to share a “believable simulation of a product” with the users.
Identify questions to answer
We phrase these as Yes or No questions so that we can measure them while observing the interviews.
Observe and capture data
While observing, each team member individually records a response to the questions we had agreed on. Each column is a participant and each row is a response to the prototype.
Combine and synthesise
The results are combined into a visual representation, along with the observations and feedback, in a research summary.
Follow up to get it right.
Add the confirmed updates to the designs. Ideally, repeat a test on the same Epic in further detail. We repeat these cycles throughout the development process.

Impact

  • The tool minimised the time spent on reviews by reducing manual efforts, including file handling, email correspondence, and physical voting.
  • The tool streamlined processes across different regions, reducing errors and manual effort. It ensured consistent procedures globally while allowing for regional customisation.
  • The tool enhanced visibility and thoroughness during audits by documenting the entire review process. Every comment and decision was logged, allowing for precise traceability during audits.

Learnings

  • Rigorous testing and iterations are crucial. Validating designs with end users is immensely rewarding, offering deep insights into human behaviour. Each testing session reveals valuable learning, guiding improvements. Iterating based on feedback was motivating and allowed me to refine my assumptions as a designer, providing a humbling experience.
  • Remote collaboration taught me how to conduct workshops, interviews, and testing virtually while navigating time zone differences. I effectively communicated with teammates and users across various locations, including Bangalore, Hyderabad, Dallas, New York, Switzerland, and London.
  • Establishing rituals for weekly planning and retrospectives proves invaluable. Documenting decisions and tasks ensures easy reference to past work and helps streamline ongoing efforts.