
Archive

NASA-LLIS-0976

Lessons Learned – User Feedback (Operational) Testing

Organization: NASA - National Aeronautics and Space Administration (NASA)
Year: 2001

Abstract: Description of Driving Event:
In the 1995-1996 timeframe, the Office of Safety and Mission Assurance (OSMA) began developing a distance learning capability under the umbrella of the Professional Development Initiative (PDI). This capability eventually evolved from a Safety and Mission Assurance discipline system into the Site for Online Learning and Resources (SOLAR) (http://solar.msfc.nasa.gov), currently one of NASA's primary distance learning resources. The intent of the initial development effort was to design and implement a prototype system for the Safety and Mission Assurance discipline. Since web-based distance learning was an untried technology within NASA, the development team determined that a User Feedback Test was required before the initial system release to exercise the system and identify problems from a user perspective. (Since this test was conducted under expected operating conditions using a portion of the expected user base, it could also be considered an operational test.)
Not surprisingly, the test results identified a series of changes that should be made to the system before production release. These included:
• Adding acronym and test buttons to the system navigation aids frame.
• Improving the legibility of embedded forms by using Portable Document Format files.
• Providing clear indications of what each hyperlink within the course work does (glossary vs. more course material vs. reference, etc.).
While changes to features and improvements of that type were expected outcomes of the User Feedback Test, we were not prepared for a finding that conflicted with one of the primary assumptions we had made when we initiated the design. We had assumed that, even though the Internet and the World Wide Web were relatively new, most of the NASA workforce would be familiar and capable enough to use the web-based distance learning capability without additional assistance. The test results indicated otherwise: user capability ranged widely from experienced to novice, and if we wanted the web-based distance learning capability to succeed on a wide basis within NASA, additional support would be required. (As a note, while most new and potential users now have enough basic Internet/web skills to navigate the SOLAR website, we still experience some difficulties when users are required to perform non-routine web activities, for example downloading and integrating plug-ins required for system use. Therefore, the introduction of new capabilities that require non-routine web activities needs to be well thought out and should include arrangements for user support.)
In response to the results of the User Feedback Test, the system was upgraded to add a more extensive "Web-Workshop" tutorial to the user help features. This "Web-Workshop" was oriented to the novice web user and covered topics such as Browser Basics, Screen Resolution, and Plug-ins and Helper Applications, as well as an Explanation of System Features. The User Feedback Test results were also a factor in including hands-on system demonstrations as part of the system rollout and marketing campaign.
A secondary outcome of the test was the discovery that the pace at which individual users completed the training varied much more than we had originally anticipated. Our original plan was to identify a "suggested" completion time for each module based on an "average" user; however, after noting the wide variation in the time required to complete modules, we decided to play down the "suggested" times so as not to stigmatize slower self-paced learners.
URI: http://yse.yabesh.ir/std;jsery=autho162s7D8308/handle/yse/200495
Subject: Information Technology/Systems
Collections:
  • NASA - National Aeronautics and Space Administration (NASA)
  • Download PDF (15.55 KB)


contributor author: NASA - National Aeronautics and Space Administration (NASA)
date accessioned: 2017-09-04T18:17:42Z
date available: 2017-09-04T18:17:42Z
date copyright: 07/26/2001
date issued: 2001
identifier other: HPUEQCAAAAAAAAAA.pdf
identifier uri: http://yse.yabesh.ir/std;jsery=autho162s7D8308/handle/yse/200495
language: English
title: NASA-LLIS-0976
title: Lessons Learned – User Feedback (Operational) Testing
type: standard
page: 3
status: Active
tree: NASA - National Aeronautics and Space Administration (NASA); 2001
content type: fulltext
subject keywords: Information Technology/Systems
subject keywords: Policy & Planning
subject keywords: Test & Verification
subject keywords: Training Equipment
DSpace software copyright © 2017-2020  DuraSpace
DSpace digital library software localized into Persian by Yabesh for Iranian libraries | Contact Yabesh
 