NASA-LLIS-0976
Lessons Learned – User Feedback (Operational) Testing
Year: 2001
Abstract: Description of Driving Event:
In the 1995-1996 timeframe, the Office of Safety and Mission Assurance (OSMA) began developing a distance learning capability under the umbrella of the Professional Development Initiative (PDI). This capability eventually evolved from a Safety and Mission Assurance discipline system into the Site for Online Learning and Resources (SOLAR) (http://solar.msfc.nasa.gov), currently one of NASA's primary distance learning resources. The intent of the initial development effort was to design and implement a prototype system for the Safety and Mission Assurance discipline. Because web-based distance learning was an untried technology within NASA, the development team determined that a User Feedback Test was required before the initial system release to exercise the system and identify problems from a user perspective. (Since this test was conducted under expected operating conditions using a portion of the expected user base, it could also be considered an operational test.)
Not surprisingly, the test results identified a series of changes that should be made to the system before production release. These included:
• Adding acronym and test buttons to the system navigation aids frame.
• Improving the legibility of embedded forms by using Portable Document Format files.
• Providing clear indications of what each hyperlink identified within the coursework does (glossary vs. more course material vs. reference, etc.).
While changes to features and other improvements of that type were expected outcomes of the User Feedback Test, we were not prepared for a finding that conflicted with one of the primary assumptions we had made when we initiated the design. We had assumed that even though the Internet and the World Wide Web were relatively new, most of the NASA workforce would be familiar and capable enough to use the web-based distance learning capability without additional assistance. The test results indicated otherwise: user capability spanned a wide range from experienced to novice, and if we wanted the web-based distance learning capability to succeed broadly within NASA, additional support would be required. (As a note, while most new and potential users now have enough basic Internet and web skills to navigate the SOLAR website, we still experience some difficulties when users must perform non-routine web activities, such as downloading and installing plug-ins required for system use. The introduction of new capabilities that require non-routine web activities therefore needs to be well thought out and should include arrangements for user support.)
In response to the User Feedback Test results, a system upgrade added a more extensive "Web-Workshop" tutorial to the user help features. This "Web-Workshop" was oriented toward the novice web user and covered topics such as Browser Basics, Screen Resolution, and Plug-ins and Helper Applications, as well as an Explanation of System Features. The test results were also a factor in including hands-on system demonstrations as part of the system rollout and marketing campaign.
A secondary outcome of the test was the finding that the pace at which individual users completed the training varied much more than we had originally anticipated. Our original plan was to publish a "suggested" time to complete each module based on an "average" user; however, after noting the wide variation in completion times, we decided to play down the "suggested" times so as not to stigmatize slower self-paced learners.
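The completion-time analysis above can be sketched with a few summary statistics. This is a hypothetical illustration only: the sample values below are invented, not data from the actual SOLAR test, and the decision threshold is an assumption.

```python
# Hypothetical sketch: summarizing module completion times from a user
# feedback test to judge whether a single "suggested" time is meaningful.
import statistics

# Minutes each test participant took to finish one training module
# (invented values for illustration, not actual SOLAR test data).
completion_minutes = [22, 25, 28, 30, 35, 41, 55, 70, 95]

mean_time = statistics.mean(completion_minutes)
spread = statistics.stdev(completion_minutes)
ratio = max(completion_minutes) / min(completion_minutes)

print(f"mean: {mean_time:.1f} min")
print(f"std dev: {spread:.1f} min")
print(f"slowest/fastest ratio: {ratio:.1f}x")

# A large spread relative to the mean (or a high slowest/fastest ratio)
# suggests that publishing one "average" time could discourage slower
# self-paced learners -- the situation the SOLAR team encountered.
```

With a spread this wide, a single "suggested" time describes almost no one, which is the rationale given above for de-emphasizing it.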
Subject: Information Technology/Systems