Special Edition:
SAP User Experience

User-Centered Design Success Story: SAP Analytics Customer Validation Project

By Ulrike Weissenberger, SAP User Experience, SAP AG – August 30, 2006

Abstract

The task given to the project team was to validate a subset of the 116 version 1.1 Analytic Applications with end users. These analytic applications had been built as demos for SAPPHIRE 2005 and covered various industries. The project team was a global team of 31 people: 5 from the User Experience (UX) group in the US and Germany, 2 from the Analytic Applications group in Germany, and 24 Solution Managers from Germany, the US, and India.

The project turned out to be the largest customer validation project ever executed at SAP. It ran from mid-October 2005 through mid-May 2006: after a planning and set-up phase, the project started in mid-December 2005 with a project kick-off meeting and completed in May 2006 with the release of 111 version 1.1 Analytic Applications in the Marketplace. The validation activities were supported by an external vendor, who was assigned to recruit the end users and run the test sessions.

Facts and Data

  • Version 1.1 Analytic Applications created with Visual Composer
  • Specific industry analytic applications from:
    • Automotive
    • Chemical
    • CP & LS
    • Healthcare
    • High Tech
    • Public Sector
    • Service Provider
    • Utilities
  • Cross-industry analytic applications from:
    • CRM
    • FIN
    • HCM
    • Manufacturing
    • SCM
    • SRM
  • Mid-October 2005 – mid-May 2006
  • 133 test sessions conducted
  • 111 revised UIs
  • 79.5% of identified UI issues fixed

 

Scope

During the project planning phase, it was determined that 25 Analytic Applications should be validated with end users. The major project goals were:

  • Validate the requirements and goals of the selected Analytic Applications with end users to determine effectiveness, efficiency, and ease of use
  • Collect usability metrics such as rates of task completion, failures, and required assists (see the sketch after this list)
  • Identify global UI issues and update the UI guidelines for Analytic Applications
  • Identify and incorporate UI and content changes for the topics tested
  • And last but not least: train Solution Managers to create the project documents necessary to follow the SAP User-Centered Design process
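
As a purely illustrative aside, the following is a minimal Python sketch of how such usability metrics could be tallied from session results. All names and the data layout are hypothetical; they are not part of the project's actual tooling or of any SAP product.

  from dataclasses import dataclass

  @dataclass
  class TaskResult:
      """One task attempt in a validation session (hypothetical structure)."""
      completed: bool  # did the participant finish the task?
      assists: int     # number of moderator hints the participant needed

  def summarize(results):
      """Compute the metrics named in the project goals."""
      n = len(results)
      completed = sum(r.completed for r in results)
      return {
          "completion_rate": completed / n,
          "failure_rate": (n - completed) / n,
          "assists_per_task": sum(r.assists for r in results) / n,
      }

  # Example: eight task attempts from one test session
  session = [
      TaskResult(True, 0), TaskResult(True, 2), TaskResult(False, 3),
      TaskResult(True, 1), TaskResult(True, 0), TaskResult(False, 2),
      TaskResult(True, 0), TaskResult(True, 1),
  ]
  print(summarize(session))
  # {'completion_rate': 0.75, 'failure_rate': 0.25, 'assists_per_task': 1.125}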

 

Overview of User-Centered Design (UCD) Process

The UCD process consists of three phases: Understand Users, Define Interaction and Design UI. The goals of these phases are to:

  1. Understand Users. Conduct user research activities to understand the end-users' needs. Product development not based on users' needs cannot succeed. Talking to end-users can uncover specific requirements for existing software, ideas for new products, and even inspire revolutionary innovation.
  2. Define Interaction. Synthesize user requirements into interaction use cases that define a product's usage from a user's perspective. The use cases can then guide the overall product development requirements, as well as set the foundation for user validation activities.
  3. Design UI. Transform the interaction use cases into UI designs. This phase begins with fast, low-fidelity prototypes and ends with interactive, high-fidelity prototypes, incorporating feedback from user validations throughout.

 

How UCD Worked for the Project

Because this project mainly dealt with validating already developed Analytic Applications, and because of the tight project schedule, the project team unfortunately decided it had to skip Phase 1 of the UCD process (Understand Users) this time. Immediately after the project kick-off meeting, the team started with Phase 2 of the process, Define Interaction, by writing use cases.

Solution Managers were trained to write use cases based on the information they had previously gathered. Although no formal user research was performed this time, they wrote the use cases by summarizing their experience and knowledge of what end users need to do when using Analytic Applications. They also created test participant profiles (TPPs) to ensure that the vendor recruited users who matched the target user profiles. After several review sessions, the TPPs were handed over to the external vendor for recruiting.

After these preparation steps were completed, Solution Managers wrote test scripts for the validation sessions, guided by the User Experience group. The actual validation sessions were conducted by the vendor, mainly from February through March 2006. During all validation sessions, Solution Managers could participate remotely via WebEx.

After the validation sessions, the Solution Managers were provided with recordings of all participants. The User Experience group held validation analysis meetings with the Solution Managers to go through the test session results provided by the vendor. Based on these insights, Phase 3 of UCD could start: designing, or rather redesigning, the UIs.

The User Interface Designer did not start with a design from scratch but took the existing design and improved it based on the feedback received during the validation sessions. The Analytics Standards and Guidelines document was updated based on the data and sent to all Solution Managers. After Solution Managers implemented the changes, the User Experience group conducted UI reviews to ensure consistency of the UI designs and adherence to UI standards. The insights from the validation sessions are now included in the new Analytics UI guidelines and will be considered from the beginning in the next release of Analytic Applications.

The following figure shows an overview of the project timeline and the milestones.

Figure 1: Project milestones and schedule

 

What Has Changed?

The top usability issues detected during the test sessions were:

  • Master-Detail Relationship
  • Selection/Scoping View
  • Value Help
  • Validity of Data
  • Screen Size
  • Content Organization
  • Drilldown to More Information
  • Column Sorting
  • Resizing Table Columns
  • Application Information Window (Error Messages)
  • Date Picker: Month and Year Selection

Some examples of what changed after the project team received end-user feedback during the validation sessions include the following:

Example 1

"Old" design: end users had problems identifying the Selection area, understanding the meaning of the buttons, and the time stamp:

Figure 2: Old screen design

With the "new" design the layout of the selection area was changed, and the buttons and the time stamp were renamed:

Figure 3: New screen design

Several customers also said that there were too many values displayed in the pie chart:

Figure 4: Too many values to be displayed in a pie chart

Therefore the model was changed so that an end user can choose between a table view and a chart view:

Figure 5: Changed model – an end user can now choose between a table view and a chart view

Example 2

In the "old" design a lot of information was provided to the end user and it was difficult to stay oriented on the screen:

Figure 6: Old design

After applying the new selection view and adding tabs to the Analytic Application, it is easier for a user to visually categorize the presented information:

Figure 7: New selection view and added tabs

Figure 8: Master detail view

 

Rating of UCD and Conclusion

The Solution Managers were interviewed about their experiences and asked for remarks and recommendations for improvement. The following quotations give an impression of the range of opinions.

"That was an excellent session, I learned a lot here. Thanks for keeping me in the loop!"

"The methodology of the customer validation project and working together with a vendor for customer validation is the right way to get beneficially and qualified feedback from customer side!"

"Reports and results items in Excel sheets are much more specific and detailed than expected!"

"Our feedback is in general absolutely positive. It's very helpful and beneficially to get real customer feedback with the chance to immediately correct some of the usability problems or inconsistencies in the product."

"Customer feedback brought me very much; the methodology for interviews was very good and professional."

"1. The validation feedback we received was extremely useful, it helped us see how the end-users were using our applications.
2. We were impressed by the level of detailed feedback provided and the valuable information received.
3. The third-party perspective helped us understand more about how we intended the users to use our applications versus how they used the applications. This feedback help us understand customer and end-users more and provides us with insight to develop more customer centric applications."

"About the process of AA validation I really liked the work group and the way they validate the UI. I like the effort that was put on recording the users using the AA and also the feedback that we got from them. That was great."

The overall opinion was that the User-Centered Design process and the support of the User Experience group helped tremendously to complete the project successfully and to meet the release deadline. The work was perceived as structured and comprehensible.

The validation project achieved all of its goals within the allotted schedule, and the Analytic Applications were delivered on time. The validation sessions identified both usability problems and functional requirements. Finally, and most importantly, the project raised awareness that end-user testing and involvement throughout the design process are necessary. Some Solution Managers learned that observing a real end user working with the application is different from talking about a product in theory or providing a demo.

 
