Annual Portfolio Assessment Expert Review Pilot
Technology Portfolio Panel Summary Report

Prepared by:
New Editions Consulting, Inc.
6858 Old Dominion Dr., Suite 230
McLean, VA 22101

January 4, 2006

Table of Contents

Executive Summary
    Findings and Recommendations
Annual Portfolio Assessment Expert Review Technology Portfolio Pilot Panel Summary Report
Section 1: Overview of the APAER Process
    1.1 APAER Purpose
    1.2 Pilot Objectives and Design Challenges
    1.3 Procedures
Section 2: Portfolio Performance
    2.1 Cluster Level Results
    2.2 Portfolio Level Results
Section 3: NIDRR Management of Portfolio
    3.1 Portfolio Level Results
    3.2 Cluster Level Portfolio Feedback
Section 4: Feedback on APAER Process
    4.1 Portfolio Level Results
    4.2 Cluster Level Comments
Section 5: Summary of Discussion
    5.1 Implications for the Technology Portfolio
    5.2 Implications for the APAER Process
    5.3 Grantee Comments
APPENDIX A: NIDRR PERFORMANCE MEASURES
    OSERS: National Institute on Disability and Rehabilitation Research: RA Title II FY2006
APPENDIX B: PANEL MEMBERS
    Panelists
    Facilitators
    NIDRR Staff
    Office of Management and Budget (OMB) Staff
    U.S. Department of Education (DOE) Staff
    New Editions Staff
APPENDIX C: AGENDA
APPENDIX D: ACCOMPLISHMENT NUGGETS BY TECHNOLOGY CLUSTERS
APPENDIX E: PARTICIPANT EVALUATION SUMMARY

Executive Summary

The National Institute on Disability and Rehabilitation Research (NIDRR) conducted its Annual Portfolio Assessment Expert Review (APAER) Process Pilot for the Technology Portfolio on October 5-6, 2005, in Arlington, VA. The APAER process was developed by NIDRR to assess the agency's progress in meeting Federal performance requirements under the Government Performance and Results Act (GPRA) and the Program Assessment Rating Tool (PART).
The PART is a systematic method of assessing and improving program performance across the Federal government, instituted by the Office of Management and Budget (OMB). The APAER process was designed as an external expert review of NIDRR grantee accomplishments using a three-year cycle. NIDRR's Technology portfolio was reviewed as a pilot of this new process. A panel of 17 researchers, consumers, clinicians, educators, advocates, and industry representatives reviewed NIDRR's Technology portfolio, based on reports from 46 grantees with active awards in 2004 under the Rehabilitation Engineering Research Center (RERC), Disability and Rehabilitation Research Project (DRRP), and Field Initiated Project (FIP) programs. The review covered:

* Objective 1: The quality and relevance of NIDRR-funded research, and the extent to which outputs and outcomes are contributing to the agency's long-term performance measures and strategic goals.
* Objective 2: The strengths and weaknesses of the technology portfolio as a whole, including recommendations to improve the portfolio.
* Objective 3: The quality and relevance of the agency's management of research directions and award decisions.

Since this was a pilot process, data were also collected to assess the feasibility of the process design and implementation.

Findings and Recommendations

The panel provided frank evaluations and a substantial list of recommendations to NIDRR in three areas: the Technology portfolio performance, NIDRR management, and the APAER process. In general, the panel expressed concern about the apparent lack of high-quality accomplishments described in the accomplishment "nuggets" submitted by the grantees. However, the panel indicated that these findings may reflect difficulties with the APAER pilot process rather than the actual performance of the grantees, and recommended that NIDRR obtain more accurate information from the grantees.
Therefore, care must be taken in interpreting and applying these findings.

Portfolio Performance. Panelists identified and discussed 49 outputs and 26 outcomes. While a number of exciting accomplishments were identified in each cluster area, the panel agreed that overall, there were not enough high-quality research and development outputs and outcomes that advance scientific knowledge, considering NIDRR's level of investment. The panel also noted insufficient evidence supporting the claims made in the reports it examined. Among the high-quality accomplishments identified, the strongest were contributions to policy and practice improvements. The Information Technology/Telecommunications cluster of awards was acknowledged as the strongest area in stimulating changes in policy and practice through the development of standards and regulations. The Environmental Access and Sensory/Communication areas also included some key accomplishments in policy and practice improvements.

Similarly, in the area of knowledge translation, the panel agreed that there were not enough high-quality consumer-oriented products being produced or new devices or technologies being transferred to industry. The Information Technology/Telecommunications area was cited as strong in product development and transfer. The grantees in that cluster had achieved a good alignment with industry, resulting in advances in accessible information technologies. In other clusters, there was an array of new products, but the potential for commercialization was unclear from the available information.

On measures related to building capacity in the field, such as use of multidisciplinary teams and research training, the panel was unable to make definitive judgments of adequacy. The panel was uncertain about the level of multidisciplinary research being conducted among grantees and suggested a need for additional details about staff and consultants.
Also, the panel was uncertain whether there was an adequate number of awards to individuals from diverse backgrounds or institutions serving underrepresented populations. Nevertheless, the panel recommended that NIDRR improve dissemination of funding opportunities to diverse populations to increase the number of applications from this group.

Management of the Portfolio. The panel assessed the structure and relevance of the portfolio. Panel members tended to be uncertain about the adequacy of funding and believed that NIDRR's budget and grant amounts limit what NIDRR and its grantees can accomplish independent of collaborations with other agencies or industries. The majority of panelists did not believe that there were a sufficient number of awards that integrated research, education, and training. The panel generally agreed that the current portfolio topics were relevant and appropriate, but not cutting-edge. However, panelists recommended areas to add to or strengthen in the portfolio: technologies for aging, mobility concerns such as balance and falls, product effectiveness research, applications of nanotechnologies, and virtual reality.

The panel acknowledged the dilemma of balancing cutting-edge, innovative work with more direct problem-solving research. There was a general feeling that perhaps some smaller percentage of NIDRR's efforts should focus on cutting-edge innovations, with the majority of the work targeting the applications of mainstream technologies and the utilization of technologies by individuals with disabilities. The panel recommended that NIDRR work with an advisory group to reconfigure some of its funding mechanisms and reconsider the role played by the RERCs. (Note: NIDRR has already begun this process.)

The panel also assessed NIDRR management activities, offering recommendations for streamlining the NIDRR bureaucracy, reconsidering the role of the project officer, and improving NIDRR's information dissemination.
The panel suggested that NIDRR reduce administrative requirements on grantees and become more nimble in working with grantees to apply the most up-to-date technologies. The panel felt that the role of NIDRR staff could be enhanced by assigning project responsibilities according to scientific affiliations and by supporting staff in staying current, for example by allowing them to attend leading conferences in the field and make site visits. Information about NIDRR's research could be improved, for example, by keeping an up-to-date Web site and providing publications that do not need to be vetted. The panel also saw a need for NIDRR to be more actively engaged in outreach to consumers, researchers, other agencies, and industry.

APAER Pilot Process. The panel reported a number of technical difficulties with the pilot process: there was limited information in many individual grant reports, the terminology used in the performance measures was complicated, the scoring process was interpreted differently in different clusters, and some information was provided too late or was inaccessible. Conceptually, the panel was concerned that reviewing projects only once every three years might miss important discoveries and advances. The panel recommended that APAER:

* Show more clearly and directly the connection to NIDRR's long-term goals and strategic plan,
* Be based on grantee performance reports that contain more accurate information,
* Involve project officers to a greater extent to guide grantees and evaluate accomplishments prior to independent review,
* Use simpler questions for greater consistency in responses,
* More clearly define terms, including standards for "adequate," definitions of outputs and outcomes, and expectations for ratings using the 0-3 scale, and
* Generally involve less paper and be simplified.
Annual Portfolio Assessment Expert Review Technology Portfolio Pilot Panel Summary Report

The National Institute on Disability and Rehabilitation Research (NIDRR) conducted its Annual Portfolio Assessment Expert Review (APAER) Process Pilot for the Technology Portfolio on October 5-6, 2005, in Arlington, VA. This summary report presents a brief description of the APAER process and the findings and recommendations of the expert panel.

Section 1: Overview of the APAER Process

The APAER process was developed by NIDRR to assess its progress in meeting Federal performance requirements under the Government Performance and Results Act (GPRA) and the Program Assessment Rating Tool (PART), and to capture progress of its grantees using a three-year cycle, with one-third of NIDRR's portfolio reviewed every three years. A team of experts reviewed NIDRR's Technology Portfolio as a pilot of this new process. This first year is a baseline year for the technology portfolio. Through this pilot, NIDRR intends to gather data to compare its baseline performance with results at the end of a specified time period in 2013. The goal and challenge to NIDRR with APAER were to establish an integrated and methodologically sound portfolio assessment and expert review process that would:

* Be manageable with existing resources and consistent with emerging standards of practice for performance assessment in Federal R&D agencies,
* Result in consistent and reliable data on NIDRR's long-term performance measures, and
* Generate evidence-based recommendations for program improvement, consistent with the accountability and program management requirements of both GPRA and PART.
1.1 APAER Purpose

The APAER is intended to provide NIDRR with a programmatic level, independent assessment of:

* The quality and relevance of NIDRR-funded research and the extent to which outputs and outcomes are contributing to the agency's long-term performance measures and strategic goals,
* The strengths and weaknesses of the technology portfolio as a whole, including recommendations to improve the portfolio, and
* The quality and relevance of the agency's management of research directions and award decisions within the portfolio.

1.2 Pilot Objectives and Design Challenges

The specific objectives of the 2005 APAER pilots were to test:

* Strategies to collect reliable and evidence-based information on significant outputs and outcomes, and
* The feasibility of the design and implementation of this portfolio level assessment of grantee performance.

In designing and conducting this pilot, NIDRR experienced a number of challenges. The key challenges follow.

* Federal agencies have not yet agreed upon guidelines or methods for reporting outcomes and conducting portfolio level assessments, and NIDRR is one of only a handful of agencies attempting this approach.
* The timing is premature for NIDRR in that the outcomes-oriented performance reporting system for its grantees is currently under development.
* Grantees completed a Supplemental Information Form, designed to collect outcomes-oriented data, at the same time that their regular Continuation Reports were due.
* Data collected from grantees in the Supplemental Information Form were sparse.
* Grantees received little technical assistance to support outcomes planning and reporting, and therefore they lacked knowledge (and practice) about how to report accomplishment nuggets, outputs, and outcomes.

1.3 Procedures

NIDRR's annual and long-term performance measures served as the basis of the process. These OMB-approved measures are listed in Appendix A. Key design features and steps in the process are summarized below.
1.3.1 Portfolio Composition

For APAER, NIDRR grouped eligible awards into portfolios according to the applicable outcome arena of the Logic Model presented in its 2005 Long-Range Plan. For the research and development arena, NIDRR further subdivided these awards into six domains according to its Long-Range Plan. The criteria for inclusion of an award in the Technology portfolio were: a technology-related focus; active in 2004, with at least one full year of work completed; and funding under one of the eligible program mechanisms--Rehabilitation Engineering Research Center (RERC), Disability and Rehabilitation Research Project (DRRP), or Field Initiated Project (FIP). For the Technology portfolio pilot, NIDRR identified 61 eligible awards. To facilitate the review process and reduce reviewer burden, NIDRR categorized the awards into five clusters of relatively similar size: (1) Sensory/Communication, (2) Mobility/Manipulation, (3) Information Technology/Telecommunications, (4) Environmental Access, and (5) Cross-Cutting awards.

1.3.2 Inputs to the Process

NIDRR developed an online supplemental information form for grantees to report retrospectively on outputs and outcomes from 2002 through 2004, with data corresponding to selected GPRA performance measures. Grantees had six weeks to complete the form, including submitting electronic documentation of outputs and outcomes to the extent possible. They provided information on their multidisciplinary teams of investigators, study descriptions, a roster of fellows/trainees and doctoral students, journal publications, and nominations of their two "best" accomplishment nuggets in each of three categories: scientific publications, short-term outcomes with evidence, and intermediate outcomes with evidence. NIDRR requested that participating grantees provide feedback on this process. Individual award reports, developed by grantees as they completed their online data entry, served as the basis for the individual level review.
Panel members received electronic copies of these reports one week prior to the APAER meeting. Immediately prior to the APAER meeting, panel members also received an additional listing with links to available publications. These links enabled reviewers to review publications that grantees had reported, but not submitted, with their supplemental information. NIDRR provided copies of these publications at the APAER meeting.

To help prepare panelists for the review, NIDRR also circulated to panel members a background report describing the APAER process, providing information about NIDRR and the technology domain, and reporting the aggregate data from the individual grantee awards. The aggregate data included information on funding mechanisms (20% of the portfolio were RERCs, 20% were DRRPs, and 37% were FIPs; there were no RRTCs); years of operation (the average was 3.4 years); number of awards with at least one previous cycle (41% of the grantees who provided reports); study samples; trainees by discipline; number of journal publications by funding mechanism; number of articles published (a total of 96); number of nuggets grantees reported as publications (46) by funding mechanism; number of short-term nuggets reported (42); and number of intermediate nuggets (33). This information was designed to give the panel a broad overview of NIDRR and a general sense of the accomplishments of the Technology portfolio.

Additionally, NIDRR provided panel members with a supplemental report to help them assess NIDRR management. This report contained information on NIDRR published funding priorities for a sample year, 2003; a compilation of 60 comments from consumers related to technology needs; an overview of the NIDRR peer review process; selection criteria for each type of funding mechanism; data on peer review scoring for FY 2003 competitions; information on the timeliness of the process; and panel characteristics.
Reviewers accessed programmatic and logistical information as it became available through a Web site designed for this purpose (http://www.neweditions.net/APAER2005/).

1.3.3 Panel Composition

NIDRR developed guidelines for panel composition to ensure appropriate representation and expertise. The main criteria were:

* A mix of "senior-level" scientists, clinicians/practitioners, educators/administrators, policy experts/Federal partners, industry representatives, and consumer advocates,
* A balance of disciplines, types of institution, geography, and individual diversity,
* A majority of non-NIDRR grantees,
* A minimum of two researchers per cluster, and
* No conflicts of interest, defined as direct financial or in-kind relationships.

Seventeen reviewers--eight researchers, two university administrators, two clinician researchers, two representatives of consumer organizations, and three industry representatives--participated on the Technology panel. The expertise of the researchers included the social sciences, aging, prosthetics and orthotics, transportation, sensory disabilities, accessibility, and engineering. (See Appendix B.)

1.3.4 Assessment Process

Panel members participated in two types of assessments: assessment of individual grantee performance and a portfolio level assessment. By reviewing the individual grant reports, panel members obtained necessary background information to assist them in judging the overall quality, relevance, and performance results of NIDRR's entire Technology portfolio. For APAER, NIDRR intended this information to assist panelists in their review of the overall portfolio, rather than the performance of individual grantees.

Individual grantee assessment. The panel received instructions for scoring the individual reports using the Scoring Form for Individual Awards and Accomplishment Nuggets, with an example of a completed form. Using this form, three panel members independently reviewed each report.
They were instructed to identify the best nuggets in the report, based on scoring, and select one nugget at each of three levels: (1) short-term outcomes, (2) intermediate outcomes, and (3) scientific accomplishments. Each panelist scored between seven and nine grant reports prior to the two-day meeting. At the on-site cluster meeting, NIDRR instructed panelists to:

1. Discuss individually identified nuggets to generate an inventory of agreed-upon accomplishment nuggets.
2. Give each nugget a brief descriptive title.
3. Establish whether the accomplishment nugget was an output or an outcome.
4. For outputs, indicate the type and the stage of knowledge development.
5. Provide a rationale or justification for the rating.

NIDRR asked panelists to rate nuggets on a scale of 1-3, with "1" being a minor contribution and "3" being an outstanding contribution. They used a rating of "0" for "unable to determine." For these ratings, NIDRR instructed panel members to:

* Consider outputs as the direct results of an activity.
* Rate an accomplishment as a short-term outcome only when an evidence-based claim clarified how the accomplishment was contributing to the advancement of knowledge. Examples included citations, requests for reprints, or how publications were having an effect on advancing knowledge.
* Rate accomplishments as intermediate outcomes if grantees demonstrated that they were using knowledge to create change.

Portfolio level review. NIDRR developed a set of questions to address: the performance of the portfolio based on NIDRR's GPRA measures and PART requirements; NIDRR management activities; and the APAER process itself. These questions guided the APAER portfolio level review.

1.3.5 Meeting Agenda

The panel met for two full days, with panelists divided into cluster review groups on the afternoon of Day 1. The agenda is included in Appendix C. The panel meeting was facilitated by Dr.
Duncan Moore, a professor of Optical Engineering, Biomedical Engineering, and Business Administration in the William E. Simon Graduate School of Business Administration at the University of Rochester and former Associate Director for Technology in the White House Office of Science and Technology Policy. On Day 2, Dr. Susan Daniels, an independent consultant and former Deputy Commissioner for Disability and Income Security Programs at the Social Security Administration, assisted as a "process facilitator." Contract staff provided technical assistance to the panel and logistical support. NIDRR staff participated as presenters, cluster liaisons, and observers. As cluster liaisons, NIDRR staff were available in each room to facilitate the process, primarily by reminding panel members of their charge and the recommended steps to follow. No NIDRR staff were present during the final closing session of the meeting.

The meeting opened with introductions and presentations by NIDRR staff. Steven Tingus, Director of NIDRR, provided opening remarks. Dr. Richard Melia, Director of the Research Sciences Division, gave an overview of NIDRR's mission, three primary goals, and highlights from NIDRR's new Long-Range Plan. Ruth Brannon, Associate Director of the Research Sciences Division, presented background information on NIDRR's Technology portfolio and research directions. She also described the funding mechanisms, average funding levels, and historical detail on total funding for the technology domain. This amount roughly tripled, from about $11 million in 1995 to $32 million in 2004. During that time, the budget for the RERCs increased from $11 million to a high of $20 million in 2002, with $18 million budgeted for 2004. The budget for the FIPs also increased, from $876,000 in 1996 to $3.7 million in 2004. Ms.
Brannon welcomed feedback on portfolio issues, including NIDRR's role in translating innovations into services and products that reflect the needs and concerns of individuals with disabilities, including those with specific needs (e.g., aging with disabilities); developing evaluation tools; and targeting scientific discoveries in areas such as the brain-computer interface, bions, and smart homes. Dr. Margaret Campbell, Coordinator of Evaluation, reviewed the concept of the 2005 pilot, its relationship to PART, and how APAER was designed. She explained the procedures for the meeting, presented definitions for accomplishments, outputs, and outcomes, and clarified NIDRR's expectations of the panel.

Section 2: Portfolio Performance

2.1 Cluster Level Results

After meeting separately, each cluster reported its findings to the entire panel, including the number of outputs, examples of the best accomplishment nuggets, and impressions of the quality of these outputs. The aggregated data from these reports follow.

2.1.1 Results

The APAER technology panel examined reports from 91% of the awardees in the Technology portfolio. In the five cluster meetings, panelists identified and discussed a total of 49 outputs and 26 outcomes. Forty-five percent of the total technology awards reviewed were RERCs, 35% were FIPs, and 20% were DRRPs. Panelists did not identify any accomplishment nuggets for an average of almost one-quarter of the awards reviewed; for the remainder of the awards, an average of two nuggets per award was reviewed in the five cluster level meetings. The aggregated data for the number, type, and quality rating of nuggets for each cluster, and averages across clusters, are displayed in Table 1. For the awards, 16% of the scores reported by individual reviewers were rated as "0," unable to determine; the most frequently reported score was a "2" (making or likely to make a substantial contribution).
Overall, only 11% of the individual scores suggested that an awardee was making or highly likely to make an outstanding contribution (a score of "3"). The Environmental Access Cluster, the group with the highest percentage of RERCs (77%), reported the highest scores, with individual reviewers rating an accomplishment as a "3" 45% of the time. In sharp contrast, the Cross-Cutting Cluster did not identify any accomplishments as "making or highly likely to make an outstanding contribution."

Table 1: Accomplishment Nuggets per Cluster

                                          Cluster 1   Cluster 2   Cluster 3   Cluster 4          Cluster 5       Average
                                          Sensory     Mobility    IT/Tele     Environ Access(1)  Cross-Cutting

Composition of the Cluster
# of Grant Reports Reviewed               9           10          9           9                  9               9.2
Total # of Grantees                       11          11          11          9                  9               10.2
Percentage Grantees Reporting/Reviewed    82%         91%         82%         100%               100%            91%
# of FIPs*                                5           4           4           2                  2               3.8
# of RERCs*                               3           6           5           7                  1               5
# of DRRPs*                               1           0           2           0                  6               1.4

Accomplishment Nuggets
Total # of Outputs                        4           9           9           17                 10              9.8
Total # of Outcomes                       9           4           6           7                  3               5.8
Range of # of Nuggets per Award           0-4         0-4         0-4         0-4                0-4             0-4
Ave. # per Award                          1.7         1.8         2.4         2.7                1.4             2
Percentage of Awards without Nuggets      22%         30%         22%         22%                22%             24%

Scoring(2)
Individual scores of "0"                  6%          0%          34%         20%                21%             16%
Individual scores of "1"                  12%         25%         10%         10%                27%             17%
Individual scores of "2"                  53%         54%         34%         25%                52%             43%
Individual scores of "3"                  29%         22%         22%         45%                0%              11%
Ave. Score per Nugget (1-3)(3)            1.9         2           2.2         2.4                1.7             2.04

(1) The Environmental Access cluster matched nuggets to categories 1-5 on the individual scoring sheet, with the multidisciplinary nugget eliminated for this review of outputs and outcomes.
(2) Percentages reflect the percentage of scores.
(3) Scores of "0" (unable to determine) were not included.

2.1.2 Inventory of Accomplishment Nuggets

Prior to the cluster level meeting, each panelist had identified up to two "best" nuggets each for short-term outcomes, intermediate level outcomes, and scientific accomplishments.
At the cluster level meetings, the panelists decided which of these nuggets merited joint review. A complete list of accomplishment nuggets identified and reviewed by each cluster is presented in Appendix D.

2.1.3 Cluster Level Reports to Panel

During the APAER panel assessment, highlights of findings and impressions from each cluster meeting were reported. Key areas of agreement from each cluster are summarized below.

Sensory/Communication Cluster. The group reported that awardees produced many policy outcomes. One example was Smith-Kettlewell's impact on Social Security legislation. The RERCs tended to produce more nuggets than the FIPs or DRRPs.

Mobility/Manipulation. The group cited several projects with important outcomes, based on evidence of presentations at international conferences, publication of guidelines for clinicians, and product development. The cluster seemed to need more involvement by practicing clinicians and consumers to ensure translation of its work. The group suggested that NIDRR grantees focus more on translation of technologies than on inventing new technologies.

Information Technology/Telecommunications Access. The panelists concluded that through these projects, NIDRR has contributed significantly to the development of standards and regulations, an ongoing need as technology evolves. The group cited projects with evidence of corporate linkages; however, they believed the cluster could be more innovative by researching emerging technologies in the IT area and considering higher-risk projects, perhaps through FIP/RERC partnerships. Publication accomplishments for assistive technology (AT) appeared reasonable, but it was uncertain whether the number of publications on IT was adequate. The group had discussed at some length how to attract graduate students, noting that the funding for graduate students was insufficient.
They suggested that NIDRR consider how to bring researchers from other areas into disability settings, perhaps by establishing an Advanced Rehabilitation Research Training (ARRT) project for this purpose.

Environmental Access. This cluster had a larger number of RERCs than the others. Panelists noted that although these RERCs produced a fair number of outputs and outcomes, including some sustainable outcomes, it takes many years for a project to produce outcomes, and they were concerned that the RERCs did not provide hard evidence to support their claims. Some multi-year RERCs had reported only one training; given the size of the grant, this was considered unacceptable. In general, panelists were uncertain how to rate the level of productivity in this cluster, and suggested a review mechanism to ensure a "return on investment"; for example, projects could be monitored and given a warning with one year to improve. The group also recommended engaging the private sector in developing customer-oriented publications, including publications for consumers, service providers, and agencies. They indicated that it was crucial to identify customer needs and that the grantees must have a comprehensive understanding of them, particularly for human service applications. They also recommended that standards be developed to determine the usefulness of new technologies, especially regarding interoperability.

Cross-Cutting. The group was impressed with two of the nine reported accomplishments, including a model tracking system that was adopted by the Medicare program. The group had some concerns regarding how the accomplishments were divided into single or multiple nuggets for rating.

2.2 Portfolio Level Results

After the cluster reports, the moderator asked panelists to consider all of the information presented as they evaluated the portfolio as a whole in relationship to specific NIDRR long-term performance measures and the PART criteria for R&D investment for quality, relevance, and performance.
Using a nominal group process to stimulate discussion, the facilitators guided a substantial and thoughtful review of the Technology portfolio. Panelists used response cards for "agree," "disagree," and "uncertain" to indicate their judgment for each question. These responses were not meant to serve as quantitative data, although in some instances the responses were counted as an indication of majority opinion. Key agreed-upon points or themes are presented below. Comments that reflect one panelist's opinion are indicated as such.

NIDRR developed a series of questions to elicit feedback on the progress of the portfolio on selected GPRA long-term performance measures that relate to the NIDRR goals and objectives in capacity building, research and development, and knowledge translation. The first NIDRR strategic goal is to increase capacity to conduct and use high-quality and relevant disability and rehabilitation research. The following three questions address specific objectives for this area.

Agreed. Among panelists who agreed, one commented that all the RERCs have a potpourri of expertise and that both the DRRPs and RERCs typically have multidisciplinary teams. Also, data are needed on consultants; panelists speculated that more disciplines may be involved but are not being paid as staff on the grant. Panelists suggested that the focus should be on whether the right team was in place for the nature of the project.

Disagreed. Panelists who disagreed indicated that the award reports lacked the needed information, and that their clusters included only a few multidisciplinary projects. They suggested that more practicing clinicians were needed and recommended that multidisciplinary research be encouraged.

Uncertain. Those who were uncertain indicated that they needed more information to make an adequate assessment.

Agreed. Among the few panelists who agreed, one indicated that the focus should be where the best work is done and how to attract the best researchers.

Disagreed.
The eight panelists who disagreed indicated that NIDRR was not apprising minority institutions of funding opportunities well enough to encourage competitive applications. One panelist questioned the impact of knowledge transfer if information is not shared with a diverse audience. Uncertain. Those who were uncertain about this question noted a lack of information on minority institutions and that diversity was needed to provide the appropriate technology to those who need it. The panelists also noted that although science can be done in the "best" place to do the work, it would be useful to know the number of applications NIDRR receives from minority institutions. Discussion. The panel considered whether the number of applications from the targeted populations was adequate. One panelist suggested that even though NIDRR is not required by law to consider Experimental Program to Stimulate Competitive Research (EPSCoR) states, doing so may be appropriate. These include states that historically receive the least amount of funding. NSF, for example, conducts workshops in rural states to elicit applications from states that receive the least amount of Federal funding. Agreed. A few panelists agreed that there were an adequate number of awards integrating research, education, and training. One panelist thought that there was a fair number of trainees in the environmental access cluster, but was skeptical about the results. Another indicated that the mobility cluster had a fair number of trainees. Disagreed. Those who disagreed expressed concerns about the lack of integration with industry, uncertainty regarding the adequacy of the training mix, and the lack of an adequate number of graduate students. Uncertain. Concerns of those who were uncertain centered on the lack of information provided by awardees to address this issue. 
They suggested that NIDRR encourage cross-training of multidisciplinary groups, develop courses and approach training from a broader perspective to include pre-service as well as in-service training. The second NIDRR strategic goal is to generate scientifically based knowledge, technologies, and applications to inform policy, change practice and improve outcomes. Two questions address objectives under this goal. Agreed. Two panelists agreed with this statement. One indicated that some of the grantees had other outcomes and outputs that were not reported in the materials reviewed, and that the portfolio should be modified, not completely refurbished. Another agreed, but suggested that more of the high quality publications were in the policy arena rather than in R&D. Disagreed. Twelve panelists disagreed with this statement. They cited concerns that there seemed to be an inadequate number of high quality projects, and a focus on outputs rather than outcomes. They suggested site visits to understand accomplishments and major gaps. One panelist noted a need to focus on mobility needs other than those currently funded. Uncertain. Those who were uncertain questioned the definitions of "adequate" and "high quality." Agreed. All, with the exception of two panelists, agreed that there were outcomes and outputs contributing to policy and practice improvements, particularly in the IT area, with supporting evidence. Disagreed. One panelist thought that there was room for improvement, since not all grantees showed evidence of meaningful changes in this area. Another added that while some projects were making good attempts at significant outcomes, none provided data that showed change in practices. The third NIDRR strategic goal is to promote the effective use of scientifically based knowledge, technologies, and applications to inform disability and rehabilitation policy, improve practice, and enhance the lives of individuals with disabilities. 
The following two questions correspond to specific objectives in this area. Disagreed. Almost all disagreed, with two undecided. The panel indicated that they did not see enough consumer-oriented products. They were concerned that products to affect policy could get lost, and there was no mention of the role and work of the RRTCs. The panel also expressed concerns about the quality of the products, including software products, since there was no proof provided to validate or verify quality. Uncertain. One panelist could recall only one consumer-based publication in his cluster and was not familiar enough with the other clusters to rate their outcomes. Agreed. Of the four who agreed, one panelist noted it was significant that all four IT/Telecom cluster group members agreed that those grantees were transferring findings to industry, including success in obtaining patents, although this is admittedly hard to do. The other clusters did not see that level of transfer. Disagreed. Most of the panel disagreed. One panelist who disagreed commented that his cluster's grant reports were largely from RERCs, and that he was not confident in their commercialization. Uncertain. Two panelists were uncertain. One panelist commented from personal experience and from the cluster he reviewed that there is a healthy list of products, but he was unaware of the quality or market cap. A good partnership with industry is needed to improve technology transfer and commercialization. One panelist was concerned about the sufficiency of market share for products, asking, "if only one percent of the market will use it, where is the bang for the buck?" Section 3. NIDRR Management of Portfolio Assessments of NIDRR management were offered on the cluster and portfolio levels. At the cluster level, panelists made judgments about each cluster after reviewing, rating and discussing a set of individual grantee reports. The portfolio level assessment was based on reports from each cluster. 
Discussion focused on portfolio research directions and NIDRR management activities. 3.1 Portfolio Level Results Facilitators led the group discussion using questions developed by NIDRR. Not all questions were addressed due to a lack of time. Group judgments are reported below for research directions and NIDRR management activities. Reaction was split. Agreed. Among those agreeing, panelists had difficulty interpreting the term "adequate." However, they felt that RERC funding levels were generally adequate. One member suggested they were adequate in that grantees matched their budgets to available funding. Disagreed. Among those who disagreed, there were two concerns. First, there was a sense that the large number of RERCs limited the other areas that NIDRR can address. Second, the current level of funding does not translate into innovations in the technology arena or the application of technology to disability-related problems and needs. They suggested that since the amount of funding is not going to increase, NIDRR must focus funds on areas it wants to impact. Uncertain. Panelists who were unable to determine the adequacy of funding recommended that NIDRR consider program effectiveness by comparing outcomes produced by each funding mechanism. A "sensitivity analysis" was also recommended to determine the preferred balance of grants at both the cluster and portfolio levels. Agreed. The two panelists who agreed that the portfolio was relevant to the state of the science indicated that among the reports they found both projects whose technology focus they supported and some that they did not. One panelist said that universities should be expected to excel in developing technologies and to be current with the state of the science. Disagreed. Twelve disagreed and considered this question inappropriate. Panelists were concerned that technologies for persons with disabilities were not robust enough to be considered truly cutting-edge. 
However, after discussion they concluded that to connect to practice, technologies do not necessarily need to be at the cutting edge. They recommended that several projects use social scientists to keep up with the state of the science. For this question, the moderator conducted a general discussion, rather than polling for agreement/disagreement. The panel indicated that it is vital for NIDRR grantees to start by identifying consumer need, but not stop there. For projects addressing a broad range of issues and bringing products to the marketplace there may not be immediate consumer needs; however, consumer needs should be addressed further down the line. Consumer needs are not static. The panel noted that the issue of persons aging with disability is extremely complex, and an area that NIDRR is beginning to examine. The panel recognized NIDRR's delicate balance in promoting original research versus outcomes, and noted the similarity to the patient-centered outcomes being reviewed in the medical arena. NIDRR may wish to consider the health services research venue and how to integrate technology to meet these needs. Key group judgments on emerging issues from the discussion were: * End users want, need and deserve better research information about products they use, including device effectiveness. * Payment for new devices would improve availability. * If NIDRR covered the whole continuum, including the supply side, it might attract a new group of grantees. * NIDRR should acknowledge the idea of "parallel universes" (mainstream and disability-specific technologies), and have different standards for evaluating and rating these projects. The outcomes of a publication may not be the best measure of success. * NIDRR should consider its role regarding research on emerging technologies that reach consumers, and develop efficient models for building relationships with industry. 
Additional individual comments included: * NIDRR should have a role in helping graduate students work on a particular platform with AT vendors. The traditional AT vendor is not a technology vendor. * NIDRR should focus on aging issues. * To be successful, the process should move along a continuum that includes marketing and business opportunities. * Patents are not sufficient; good strategic liaisons are needed. However, these may not work outside a small domain. * The majority of AT devices are abandoned after a short period since there are no benchmarks on effectiveness. A set of quantitative benchmarks is needed. NIDRR is interested in determining the effectiveness of these products and if they are making a difference in people's lives. Group judgments on improving the portfolio were: * NIDRR needs to develop information on whether products actually work. Industry would like this. * NIDRR should address the full range of priorities for: (1) development, (2) research on effectiveness, and (3) transfer possibilities with vendors. * NIDRR could be more effective in providing research data to promote the highest standard of service related to technology, device and systems cost-effectiveness data. * How NIDRR determines its priorities and its funding mechanisms is important. NIDRR should set priorities based on what it learns from the field. 3.1.1 Closing Discussion During this final session, the panel discussed issues emerging from earlier discussions relating to a continuum of NIDRR interests, a process to identify critical gaps, and topics that should be discontinued. They also provided recommendations regarding NIDRR management of activities. NIDRR staff were not present. The panel discussed two examples that portray part of the dilemma in formulating a plan for measuring accomplishments: * One panelist described how an RRTC had been working with a national consumer association, sharing the results of their research, TA, and advocacy. 
However, the national association adopted a broad position that was not supported by evidence. * Another RERC produced practices that were adopted by a pilot study in one state, then reported nationwide. The panel was concerned with the balance between directing resources toward reporting RERC results and generating new knowledge. The panel also considered the value of effectiveness studies and the need for mass customization and low-cost services. They asked if the IT vendors could provide a model for commercialization in other cluster areas such as mobility. The panel offered no recommendations. One panelist suggested that NIDRR is in a tough bind, needing to be nimble and flexible, while articulating specific expectations and priorities. NIDRR is now planning for 2008, and it may be hard to shift priorities. The panel recommended that NIDRR consider the feasibility of: * Using the RERC mechanism to address gaps, * Earmarking seed money as other agencies have done (perhaps 10-15% of the total for grantees), * Relying on new investigators and new FIP funding to address gaps, perhaps designing new investigator awards for new technologies and perspectives, and * Using SBIRs for this purpose. In considering the process to identify topics and trends, one panelist suggested that NIDRR hold a bidders' conference, similar to those held by DOD. Because the portfolio is so broad, the panel urged NIDRR to consider: * Establishing an advisory or steering committee, preferably composed of non-grantees, representing specific groups, with a diversity of consumer, industry, and science interests, to review the strategic plan on an annual basis; * Involving stakeholders, including industry, in setting strategic goals; and * Evaluating the state of technology annually. The panel reported that they were unable to decide what topics should be discontinued from this process. 
The panel felt strongly that some redirection was needed and that it should be connected to the short-term strategic plan. The panel: * While recommending that funding be redirected in the mobility cluster from a focus on wheelchairs and lower extremity mobility, recognized advances in newer, lightweight materials for these products. * In deciding whether to redirect funding, suggested NIDRR consider the audience, how they would use the technologies, or if the technologies would be used for new diseases. * Noted that the RERCs seem to have an unlimited life span. NIDRR needs help to redefine the role and mission of the RERCs. NSF, for example, limits centers to 10 years of funding. (Note that NIDRR is already examining the RERCs.) * Recommended that NIDRR reduce both the administrative burdens on RERCs and requirements for the length of proposals. Additional individual comments were: * One panelist commented that the significant RERC administrative requirements and expectations in operating the RERCs do not match their funding levels, and that their goals and objectives are not achievable with funding solely from NIDRR. In contrast, NSF provides grants for millions of dollars for a 15-page response to an RFP. * In comparison to the approach used by NIDRR, one panelist noted that FIPSE (Fund for the Improvement of Postsecondary Education) required that a 7-page concept paper be vetted first. Then, selected applicants were invited to submit a full-length application. This provided a great way to generate a volume of ideas. * One panelist suggested that the standard process of captioning should not require funding from NIDRR. Instead, funding should support new and emerging issues, such as FCC rulings for 100% captioning for TV by 2006, voice recognition, and internet streaming. * NIDRR should consider a process for external validation of priorities. The panel offered the following comments and recommendations regarding the nature of NIDRR management activities. 
Major themes revolved around NIDRR communication and collaboration with the field, translating research and enhancing NIDRR project officer involvement in management. The panel perceived that NIDRR had gone "underground." NIDRR does not get much publicity and does not communicate quickly with the field. It is difficult to find NIDRR information, even on its Web site. According to the panel, ongoing communication is needed between NIDRR and the field to provide current updates, and bring more visibility to the agency. The panel suggested that NIDRR reach out to consumers, researchers and industry with an inspiring vision, generating excitement, instead of responding to situations. NIDRR might consider using communication, perhaps electronic, that does not need to be vetted. The panel suggested NIDRR improve communications with grantees, be more directive and seek out grantees and collaborations. Similarly, the panel recommended that NIDRR improve communication and collaboration with agencies, including leveraging funding. To improve the translation of research, the panel offered management recommendations: * Develop surveys, and establish standards and criteria for successful translation of research. The panel noted that NIDRR does not have benchmark studies or tools to measure effectiveness or impacts on quality of life. * Since it lacks the resources needed to reach consumers, work with others to do so. NIDRR addresses applied science, yet is evaluated on whether findings are getting to consumers. * Review its involvement with participatory action research. When a Center applies for a grant, it must show consumer involvement in every stage of the project, to increase the likelihood of appropriate application of the results. The panel generally agreed that it would be a good management strategy to enhance the leadership role of the NIDRR project officer. 
First, it suggested NIDRR consider designating project officers by scientific area rather than by funding mechanisms. Second, it suggested NIDRR review how other agencies use project officers. For example, a rotating system for project officers, somewhat like NSF's, might be useful. Or, consider how NIH project officers, who monitor a high number of awards, keep up with trends in the field. Finally, the panel suggested NIDRR encourage and fund project officer attendance at the top 10 conferences to stay current and "bring cross-fertilization." The panel offered one additional comment: renew efforts to attract more graduate students and to fund them adequately. The panel was concerned with the lack of graduate students and noted a previous process (that NSF is now considering) whereby NIDRR funding followed the student, rather than the university. 3.2 Cluster Level Portfolio Feedback While the focus of the cluster assessment was intended to be on portfolio performance, numerous comments were generated about research directions and NIDRR management activities. These comments reflect group judgments, not individual statements, and reflect general discussion by the APAER panel, as well as specific cluster reports. Sensory/Communication Cluster. No comments. Mobility/Manipulation. The cluster included a large number of grants in wheeled mobility and lower limb prosthetics, and not many in other areas of mobility and manipulation, with research priorities focused on external devices. The group suggested that NIDRR consider its role in supporting the development and dissemination of implantable joints and higher technology electronics. The group recommended that research areas be broader to cover additional important areas, including balance disorders, falls, and issues of disease and age-related conditions such as arthritis. A partnership with NIH could allow NIDRR to focus on getting new devices to the user. Information Technology/Telecommunications Access. 
The team recommended that for this area NIDRR consider using the generic term being used in Europe and among the innovators, "ICT - Information and Communications Technology." The team encouraged NIDRR to develop ways to bring new technologies into the disability communities. They were concerned that certain technologies are not applied in a disability setting and that it will take 10 years for nanotechnology and other technologies to move into the disability community. Specifically, they recommended that NIDRR contact Mike Rocco at NSF regarding collaboration on nanotechnology. There were gaps in the portfolio, but the group did not have sufficient information to address them. For example, new materials including implantable materials were not seen in the cluster. In general, the group felt that the disability community was not moving fast enough to influence this field and the emerging technology. The group was concerned about NIDRR's formulaic approach to implementing RERCs and DRRPs, given how quickly technologies change over the course of a five-year grant. The group felt that the grantees are meeting the specifics of their grant proposals, rather than taking advantage of new discoveries and opportunities. NIDRR should recognize the different approaches used in research versus development of innovative products, and encourage engineering methods. Environmental Access. The group questioned whether NIDRR's funding structure allowed for new and exciting ideas and noted that in the past NIDRR awarded one-year innovation grants. 
In discussing the funding mechanisms, they speculated that (a) while a FIP might lead to a DRRP, this path may not be optimal; (b) there may or may not be a flow into an RERC or an RRTC; (c) the number of DRRPs appears to be too few; and (d) the RERCs do not have the leeway to innovate over their five-year grant period if they want to honor their initial commitment, particularly with the number of partnerships that are part of the RERC model. They noted that NIDRR has not funded "virtual reality," and recommended that NIDRR realign Field Initiated Projects for high-risk endeavors. The team also encouraged NIDRR to play a role in understanding developments in nanotechnology, noting its potential to revolutionize science. The group also concurred that the field needs more researchers and that NIDRR must take steps to attract graduate students, including students from other fields, with a 5-10 year trajectory. It also recommended that NIDRR establish a national steering group to oversee the portfolio, a strategic plan for emerging technologies, and an advisory council to meet periodically to identify new technologies. Crosscutting. The group recommended that NIDRR focus on sending information clearly and succinctly. Section 4. Feedback on APAER Process Assessments of the pilot APAER process were offered by panelists at both the cluster and portfolio levels. At the portfolio level, after reports back on each cluster, panelists were asked to assess the entire portfolio process. At the cluster level, panelists made judgments based on their experience reviewing, rating and discussing a set of individual awards. 4.1 Portfolio Level Results During panel discussion, the following comments and recommendations on the APAER process highlighted the panel's concerns: * With the APAER 3-year review cycle, an internal review by the project officer (PO) should occur first. 
The PO should be more involved in helping grantees engage in self-evaluation and redirection of resources throughout the course of the project. * There was a disconnect in the role of the PO, creating an identity issue for NIDRR. For example, certain AT projects that moved from NIDRR to RSA are now missing a link with research. * The APAER process was very complex. The panel recommended reducing the paperwork and simplifying the process. 4.2 Cluster Level Comments Several similar comments were made during reports from the various clusters. In general, the comments indicated that panel members sometimes had trouble accessing materials from grantees and difficulties with some of the links provided to publications. Moreover, they were troubled that grantees may not have had time to adequately complete the online report, or perhaps did not understand NIDRR's request. The panelists also found the amount of information on accomplishments provided by grantees to be inadequate and were disappointed to find that some grant reports did not have any nuggets. The panel recommended that NIDRR be more involved in helping grantees prepare reports prior to the APAER expert panel meeting. Panelists also experienced several scoring dilemmas. The clusters tended to devise their own standards for rating grant reports as a "3," "2," or "1." They were also uncertain how to rate publications that were dated 2005, given the directive to focus on research and publications through 2004. They were concerned that sometimes NIDRR was not acknowledged in publications. Although they felt that meaningful changes were being made, they questioned whether these were adequate, and how to define "adequate." One group indicated that they would have preferred to have each team member review all 9 grants rather than only a portion of the grants for that cluster. They also suggested that the term "multidisciplinary" be reserved for projects with more than two disciplines per team. 
Another group reported that they could not adequately evaluate a cluster of miscellaneous projects due to the diverse nature of the projects. They found the discussion questions difficult to apply, so they proceeded with a discussion without using the questions. The cluster groups also suggested that: * The information requested from grantees should match what reviewers are asked to score and the criteria for their annual review process. For this to occur, standards and benchmarks need to be developed. In terms of a review process, NIDRR would be better served to differentiate research areas and develop specific types of evidence that could be provided routinely. NIDRR might be able to develop and put into a data system a list of potential measures for providing evidence. A standard process is needed for rating accomplishments and guidance for acceptable evidence. * NIDRR should simplify the questions for greater consistency across the requests for information. APAER as it stands is too complex; it seems to be a way to cover all the topics that NIDRR oversees, rather than assessing the impact of work that is being funded. The panel recommended rethinking the process and asking the grantees questions in clear English. * This process must be made easy for the people who report and for the reviewers. If the format were easier, some of the projects that did not provide information might have done so. The group suggested approaches used by other agencies. For example, the National Science Foundation (NSF) has a system of rotating Program Directors. The Defense Advanced Research Projects Agency (DARPA) has a number of program officers (POs) who require nuggets every six weeks to produce a constant stream of information to stay on top of the results. Perhaps NIDRR could have its grantees continue thinking of nuggets to provide a constant flow of information back to the PO. The group recommended looking at other agencies and their distribution of awards and portfolios. 
NIDRR needs a model that tracks progress. * The panel was concerned that a 3-year review would not capture accomplishments that occurred in years when projects are not being reviewed. They suggested that NIDRR clarify this process. (Note: NIDRR intends to capture accomplishments on an annual basis, irrespective of the portfolio review cycle.) Also, in comparing nuggets across projects, NIDRR needs to pre-define the milestones on which they will be measured. This would allow NIDRR to focus the investigator on the long-range goal of transferring research into clinical practice, and help them review their progress in meeting their goals. The process needs to be customized for individual proposals. A summary of the APAER process evaluation completed by panelists, including individual comments, is provided in Appendix E. Section 5: Summary of Discussion 5.1. Implications for the Technology Portfolio According to the APAER panelists, NIDRR may want to examine the structure, function and focus of the Technology portfolio to improve overall outcomes. NIDRR may want to explore how joint initiatives with other federal agencies can extend the impact of the somewhat limited funding that NIDRR has for research related to technologies for individuals with disabilities. The panelists suggested that, with the advice of a national advisory council, NIDRR may want to reconceptualize its role in the technology arena and decide, for example, to concentrate its efforts on developing modifications to mainstream technologies so that they are more usable by individuals with disabilities. Panelists also suggested that NIDRR may have a major role in encouraging collaborative research with graduate students from related fields so that there is more cross-fertilization of ideas and so that those in other disciplines begin to consider universal design of technology in their innovations. Limitations of the Findings. 
Certain statements made by panel members may more accurately reflect either the limited information they received or their limited knowledge or perception of NIDRR, rather than the actual circumstances within NIDRR or NIDRR's operations. Some statements may be inaccurate or misleading, and NIDRR will need to consider these factors in its response to the report. NIDRR staff addressed some of these in editor's notes in the text of the report. Other examples of potentially misleading statements noted by NIDRR staff include: * NIDRR should contact NSF regarding nanotechnologies. NIDRR is already involved in discussions with NSF and NIH regarding nanotechnologies. * Work more closely with industry in transferring technologies for commercialization. NIDRR is doing this through its SBIR program. * Obtain more input from consumers and the field. NIDRR employs a protocol for obtaining input from consumers and other stakeholders in preparing its Long-Range Plan and when new priorities or regulations are being considered. While each of the above examples demonstrates misinformation about NIDRR, the perceptions of the panel members are important and suggest, at a minimum, a need to supply reviewers with additional information about NIDRR to facilitate the review process. Moreover, the misinformation may point to areas for further consideration by NIDRR, such as increasing public awareness about its programs, changes in certain policies and procedures, or redirection of funding. 5.2 Implications for the APAER Process In discussing the APAER process, the panel provided many recommendations to enhance the validity and efficiency of NIDRR's process. These included a strong recommendation to increase the involvement of project officers in guiding grantees during the course of their grant implementation, as well as in preparation for the APAER review. 
Central themes that emerged from the panel in terms of the APAER process were to simplify the process, simplify and clarify the terminology used, and make sure that the reviewers had appropriate materials and adequate time to review them prior to coming together to consider the portfolio as a whole. Limitations with the Technology Pilot. As mentioned earlier, NIDRR is one of a few federal research agencies conducting an annual portfolio-level performance review. The newness of this approach presented some difficulties that should be considered in evaluating the findings of this pilot. There was uncertainty about how to proceed, but decisions were made based on available data from other agency models, reporting requirements, and knowledge of NIDRR's needs. A number of difficulties with the process were identified in advance of the meeting and others were identified on-site, suggesting that caution be used in interpreting the findings. These difficulties included: * There was no independent field-testing of instruments prior to their use in the pilot. * Grantees' lack of knowledge and experience with the outcome measure terminology resulted in inconsistent and limited data collection. * The sample of best accomplishments was provided to the panelists by the grantees, without prior review by NIDRR staff. * Competition data about the peer review process were limited to the 2003 competition, as it was difficult to collect this information from NIDRR. However, the data were not used, as the peer review process was not discussed at the meeting due to lack of time. * There was limited time for panelists to review materials. Some materials were only available on-site. Some links to materials provided by grantees were inaccessible. * Terminology was confusing to panelists. * Scoring of accomplishment nuggets was not done consistently across clusters. 
5.3 Grantee Comments
Grantees were given the opportunity to provide feedback on the supplemental information form and the online data collection process. Regarding the online data collection process, grantees seemed to concur that, conceptually, the separation of activities, outcomes, and outputs was useful and potentially powerful. However, grantees found that completing the form required more labor than anticipated, and they experienced some frustration both in understanding the directions and terminology and in navigating the online form. Grantees also reported that they found it difficult to "nominate their best nuggets" and would have preferred the capability to add more data, rather than being restricted in how much data could be entered.
APPENDIX A: NIDRR PERFORMANCE MEASURES
OSERS: National Institute on Disability and Rehabilitation Research: RA Title II FY2006
CFDA Number: 84.133 - National Institute on Disability and Rehabilitation Research
Goal 7: Special Education and Rehab. Services Internal Goal.
Objective 7.1 of 2: Advance knowledge through capacity building: Increase capacity to conduct and use high-quality and relevant disability and rehabilitation research and related activities designed to guide decision-making, change practice, and improve the lives of individuals with disabilities.
Indicator 7.1.1 of 2: By 2013, increase by at least 25% the number of new NIDRR grants awarded to multidisciplinary teams of investigators that represent a relevant balance of sub-fields within medicine, behavioral and social sciences, education, engineering, information sciences, and design.
Targets and Performance Data: The number of new grants awarded to multidisciplinary teams of investigators that meet the stated criteria. (Input-Oriented Capacity Building)
Year | Actual Performance | Performance Target
2006 | [blank] | 999
Assessment of Progress: [blank]
Sources and Data Quality - Additional Source Information: Peer review of applications.
Frequency: Annually. Collection Period: 2005-2006. Data Available: November 2006.
Indicator 7.1.2 of 2: By 2013, increase by at least 25% the number of new grants awarded to minority-serving institutions and to first-time NIDRR investigators from underrepresented populations.
Targets and Performance Data: The number of grants awarded to minority-serving institutions and to first-time NIDRR investigators from underrepresented populations. (Input-Oriented Capacity Building)
Year | Actual Performance | Performance Target
2006 | [blank] | 999
Assessment of Progress: [blank]
Sources and Data Quality - Source: Performance Report. Grantee Performance Report: 1820-0642 Annual Performance Reporting Forms for NIDRR Grantees (RERCs, RRTCs, DBTACs, DRRPs, Model Systems, Dissemination & Utilization Projects). Additional Source Information: IPEDS for minority-serving institutions; additionally, grant applications for first-time investigators. Collection Period: 2005-2006. Data Available: November 2006.
Objective 7.2 of 2: Advance knowledge through translation and dissemination: Promote the effective use of scientifically-based knowledge, technologies, and applications to inform disability and rehabilitation policy, improve practice, and enhance the lives of individuals with disabilities.
Indicator 7.2.1 of 1: By 2013, increase by at least 20% the number of non-academic and consumer-oriented dissemination publications and products, nominated by grantees to be their best outputs based on NIDRR-funded research and related activities, that demonstrate ''good to excellent'' utility for intended beneficiaries.
Targets and Performance Data: The percentage of non-academic and consumer-oriented dissemination products and services, nominated by grantees to be their best outputs based on NIDRR-funded research and related activities, that meet the stated criteria.
(Outcome-Oriented Knowledge Translation)
Year | Actual Performance | Performance Target
2005 | [blank] | 999
2006 | [blank] | 999
Assessment of Progress - Explanation: Approximately 1/3 of NIDRR's grants, based on a judgmentally selected sample of grantee-nominated ''non-academic, consumer-oriented dissemination publications and products,'' are reviewed annually. The first completed three-year cycle of portfolio assessments will include FY 2005, 2006, and 2007.
Sources and Data Quality - Additional Source Information: Expert Review Panel. Frequency: Annually. Collection Period: 2005. Data Available: September 2005.
Program Goal: To conduct high-quality research and related activities that lead to high-quality products.
Objective 8.1 of 4: Advance knowledge through capacity building: Increase capacity to conduct and use high-quality and relevant disability and rehabilitation research and related activities designed to guide decision-making, change practice, and improve the lives of individuals with disabilities.
Indicator 8.1.1 of 3: By 2013, at least 10% of all newly awarded projects will be multi-site, collaborative controlled studies of interventions and programs.
Targets and Performance Data: The percentage of new projects conducting multi-site, collaborative controlled trials. (Output-Oriented Capacity Building)
Year | Actual Performance | Performance Target
2005 | [blank] | 999
2006 | [blank] | 999
Assessment of Progress - Explanation: This applies only to RERCs, RRTCs, Model Systems grants, and DRRPs.
Sources and Data Quality - Additional Source Information: Staff review of grant applications. Collection Period: 2004-2005. Data Available: November 2005.
Indicator 8.1.2 of 3: By 2013, at least 100 individuals from diverse disciplines and backgrounds will be actively engaged in conducting high-quality disability and rehabilitation research and demonstration projects.
Targets and Performance Data: The number of former pre- and postdoctoral students and fellows who received research training supported by NIDRR and who are actively engaged in conducting high-quality research and demonstration projects. (Outcome-Oriented Capacity Building)
Year | Actual Performance | Performance Target
2007 | [blank] | 999
Assessment of Progress: [blank]
Sources and Data Quality - Source: Performance Report. Grantee Performance Report: 1820-0642 Annual Performance Reporting Forms for NIDRR Grantees (RERCs, RRTCs, DBTACs, DRRPs, Model Systems, Dissemination & Utilization Projects). Frequency: Annually. Collection Period: 2006-2007. Data Available: November 2007.
Indicator 8.1.3 of 3: Percentage of NIDRR-supported fellows, post-doctoral trainees, and doctoral students who publish results of NIDRR-sponsored research in refereed journals.
Targets and Performance Data: [blank]
Assessment of Progress: [blank]
Sources and Data Quality: [blank]
Objective 8.2 of 4: Advance knowledge through research and related activities: Generate scientific-based knowledge, technologies, and applications to inform policy, change practice, and improve outcomes.
Indicator 8.2.1 of 6: By 2013, increase by at least 20% the number of discoveries, analyses, and standards developed and/or tested with NIDRR funding that have been judged by expert panels to advance understanding of key concepts, issues, and emerging trends and strengthen the evidence base for disability and rehabilitation policy, practice, and research.
Targets and Performance Data: The number of discoveries, analyses, and standards developed and/or tested with NIDRR funding that meet the stated criteria. (Outcome-Oriented Research & Development)
Year | Actual Performance | Performance Target
2007 | [blank] | 999
Assessment of Progress - Explanation: Approximately 1/3 of NIDRR's grants, based on a judgmentally selected sample of grantee-nominated ''discoveries,'' will be reviewed.
The first completed three-year cycle of portfolio assessments will include FY 2005, 2006, and 2007.
Sources and Data Quality - Source: Performance Report. Grantee Performance Report: 1820-0642 Annual Performance Reporting Forms for NIDRR Grantees (RERCs, RRTCs, DBTACs, DRRPs, Model Systems, Dissemination & Utilization Projects). Additional Source Information: Expert Review Panel. Frequency: Annually. Collection Period: 2005-2007. Data Available: November 2007.
Indicator 8.2.2 of 6: By 2013, increase by at least 20% the number of new or improved tools and methods developed and/or tested with NIDRR funding that have been judged by expert panels to improve measurement and data collection procedures and enhance the design and evaluation of disability and rehabilitation interventions, products, and devices.
Targets and Performance Data: The number of new or improved tools and methods developed and/or tested with NIDRR funding that meet the stated criteria. (Outcome-Oriented Research & Development)
Year | Actual Performance | Performance Target
2007 | [blank] | 999
Assessment of Progress - Explanation: Approximately 1/3 of NIDRR's grants, based on a judgmentally selected sample of grantee-nominated ''tools and methods,'' will be reviewed. The first completed three-year cycle of portfolio assessments will include FY 2005, 2006, and 2007.
Sources and Data Quality - Source: Performance Report. Grantee Performance Report: 1820-0642 Annual Performance Reporting Forms for NIDRR Grantees (RERCs, RRTCs, DBTACs, DRRPs, Model Systems, Dissemination & Utilization Projects). Additional Source Information: Expert Review Panel. Frequency: Annually. Collection Period: 2005-2007. Data Available: November 2007.
Indicator 8.2.3 of 6: By 2013, increase by at least 20% the number of new and improved interventions, programs, and devices developed and/or tested with NIDRR funding that have been judged by expert panels to be successful in improving individual outcomes and increasing access.
Targets and Performance Data: The number of new and improved interventions, programs, and devices developed and/or tested with NIDRR funding that meet the stated criteria. (Outcome-Oriented Research & Development)
Year | Actual Performance | Performance Target
2007 | [blank] | 999
Assessment of Progress - Explanation: Approximately 1/3 of NIDRR's grants, based on a judgmentally selected sample of grantee-nominated ''interventions,'' will be reviewed. The first completed three-year cycle of portfolio assessments will include FY 2005, 2006, and 2007.
Sources and Data Quality - Source: Performance Report. Grantee Performance Report: 1820-0642 Annual Performance Reporting Forms for NIDRR Grantees (RERCs, RRTCs, DBTACs, DRRPs, Model Systems, Dissemination & Utilization Projects). Additional Source Information: Expert Review Panel. Collection Period: 2005-2007. Data Available: November 2007.
Indicator 8.2.4 of 6: Percentage of grantee research and development that has appropriate study design, meets rigorous standards of scientific and/or engineering methods, and builds on and contributes to knowledge in the field.
Targets and Performance Data: The percentage of grantee research and development that has appropriate study design, meets rigorous standards of scientific and/or engineering methods, and builds on and contributes to knowledge in the field. (Activity-Oriented Research & Development)
Year | Actual Performance | Performance Target
2002 | 82 | 65
2003 | 96 | 70
2004 | 89 | 70
2005 | [blank] | 999
2006 | [blank] | 85
Assessment of Progress - Explanation: The methodology for assessment of quality of funded projects changed in 2004. NIDRR no longer uses a second expert peer review of grantee research designs. The current measure is the ''Percentage of funded grant applications that received an average peer review score of 85 or higher.'' The data have been recalculated using the new methodology.
Sources and Data Quality - Source: Other. Other: Peer Review. Date Sponsored: 05/27/2005. Frequency: Annually.
Collection Period: 2004-2005. Data Available: November 2005.
Indicator 8.2.5 of 6: Average number of publications per award based on NIDRR-funded research and development activities in refereed journals.
Targets and Performance Data: The number of publications per award meeting the stated criteria. (Output-Oriented Research & Development)
Year | Actual Performance | Performance Target
2002 | 2.74 |
2003 | 2.84 | 8
2004 | | 5
2005 | | 5
2006 | | 2
Assessment of Progress - Explanation: An accepted standard, such as the International Scientific Index (ISI), will be used. Data for publications will be collected over a calendar year, instead of a fiscal year.
Sources and Data Quality - Source: Performance Report. Grantee Performance Report: 1820-0642 Annual Performance Reporting Forms for NIDRR Grantees (RERCs, RRTCs, DBTACs, DRRPs, Model Systems, Dissemination & Utilization Projects). Frequency: Annually. Collection Period: 2004. Data Available: September 2005. Improvements: Publication data from four additional program funding mechanisms (DBTACs, DRRPs, FIPs, and KDU (Dissemination & Utilization) projects) will be included.
Indicator 8.2.6 of 6: Percentage of new grants that include studies funded by NIDRR that assess the effectiveness of interventions, programs, and devices using rigorous and appropriate methods. (Output-Oriented Research & Development)
Targets and Performance Data: The percentage of new grants that meet the stated criteria.
Year | Actual Performance | Performance Target
2002 | 65 |
2003 | 59 |
2004 | 59 |
2006 | | 999
Assessment of Progress: [blank]
Sources and Data Quality - Additional Source Information: Review of grant applications. Frequency: Annually. Collection Period: 2005-2006. Data Available: November 2006.
Objective 8.3 of 4: Advance knowledge through translation and dissemination: Promote the effective use of scientifically-based knowledge, technologies, and applications to inform policy, improve practice, and enhance the lives of individuals with disabilities.
Indicator 8.3.1 of 2: By 2013, increase by at least 20% the number of tools, methods, interventions, programs, and devices, developed and/or validated with NIDRR funding, that meet the standards for review by independent scientific collaborations and registries.
Targets and Performance Data: The number of NIDRR-funded tools, methods, interventions, programs, and devices that meet the stated criteria. (Output-Oriented Knowledge Translation)
Year | Actual Performance | Performance Target
2007 | [blank] | 999
Assessment of Progress - Explanation: Approximately 1/3 of NIDRR's grants, based on a judgmentally selected sample of published discoveries, tools, methods, interventions, programs, and devices. The first completed three-year cycle of portfolio assessments will include FY 2005, 2006, and 2007.
Sources and Data Quality - Source: Performance Report. Grantee Performance Report: 1820-0642 Annual Performance Reporting Forms for NIDRR Grantees (RERCs, RRTCs, DBTACs, DRRPs, Model Systems, Dissemination & Utilization Projects). Additional Source Information: Expert Panel Review. Frequency: Annually. Collection Period: 2005-2007. Data Available: November 2007.
Indicator 8.3.2 of 2: Number of new or improved assistive and universally-designed technologies, products, and devices developed and/or validated by grantees that are transferred to industry for potential commercialization.
Targets and Performance Data: The number of new or improved assistive and universally designed technologies, products, and devices. (Outcome-Oriented Knowledge Translation)
Year | Actual Performance | Performance Target
2005 | [blank] | 999
2006 | [blank] | 999
Assessment of Progress: [blank]
Sources and Data Quality - Source: Performance Report. Grantee Performance Report: 1820-0642 Annual Performance Reporting Forms for NIDRR Grantees (RERCs, RRTCs, DBTACs, DRRPs, Model Systems, Dissemination & Utilization Projects). Frequency: Annually.
Collection Period: 2004-2005. Data Available: April 2006.
Objective 8.4 of 4: Enhance efficiency of NIDRR grant award process.
Indicator 8.4.1 of 1: Notification: Notification of applicants.
Targets and Performance Data: The percentage of competitions announced by Oct. 1.
Year | Actual Performance | Performance Target
2003 | 21 |
2004 | 23 |
2005 | 8 |
2006 | | 999
The percentage of grant awards issued within 6 months of the competition closing date.
Year | Actual Performance | Performance Target
2003 | 70 |
2004 | 83 |
2006 | | 999
Assessment of Progress: [blank]
Sources and Data Quality - Additional Source Information: GAPS and Federal Register Notice. Frequency: Annually. Collection Period: 2005-2006. Data Available: October 2006. Source: 2006PM 08/01/2005 01:26 PM
APPENDIX B: PANEL MEMBERS
Panelists
Robert L. Baldwin, Ph.D., Licensed Psychologist, University of Colorado, 1793-2 Quentin Street, Aurora, CO 80010. Phone 720-859-0767; Fax 720-589-3113; E-mail robert.baldwin@uchsc.edu
M. Nell Bailey, Project Director, RESNA, 1700 N. Moore Street, Suite 1540, Arlington, VA 22209-1903. Phone 703-524-6686 x 305; Fax 703-524-6630; TTY 703-524-6639; E-mail nbailey@resna.org
Patrick E. Crago, Ph.D., Allen H. and Constance T. Ford Professor and Chairman, Department of Biomedical Engineering, Case Western Reserve University, Wickenden Building Room 520, 10900 Euclid Avenue, Cleveland, OH 44106-7207. Phone 216-368-3977; Fax 216-368-4969; E-mail pec3@case.edu
David Dikter, Executive Director, Assistive Technology Industry Association, 1600 Beacon Street # 301, Brookline, MA 02446. Phone 617-524-0035; Fax 617-739-0330; E-mail ddikter@atia.org
Corinne Kirchner, Ph.D., Director, Policy Research & Program Evaluation, American Foundation for the Blind, 11 Pennsylvania Plaza, Suite 300, New York, NY 10001-2018. Phone 212-502-7640; Fax 212-402-7773; E-mail Corinne@afb.net
Kathleen Laurin, Ph.D.,
Project Director, University of Montana Rural Institute, 634 Eddy Avenue, CHC-009, Missoula, MT 59812. Phone 406-243-5769; Fax 406-243-4730; TTY 406-243-5769; E-mail klaurin@ruralinstitute.umt.edu
Phoebe Liebig, Ph.D., Associate Professor, Andrus Gerontology Center, University of Southern California, 3715 McClintock Avenue, Building 228, Los Angeles, CA 90089-0191. Phone 213-740-1719 (M, W & Th), 310-202-9187 (T & F); Fax 213-740-7069; E-mail liebig@usc.edu
Michael Lightner, Ph.D., University of Colorado at Boulder, College of Engineering, Research East, 3100 Marine Street, 570UCB, Boulder, CO 80309-0570. Phone 303-492-5180; Fax 303-492-5180; E-mail lightner@boulder.colorado.edu
Rita M. Patterson, Ph.D., Associate Professor, University of Texas Medical Branch, Department of Orthopedics & Rehabilitation, 301 University Boulevard, Galveston, TX 77555-0174. Phone 409-747-3245; Fax 409-747-3240; E-mail rita.patterson@utmb.edu
Grace Peng, Ph.D., Program Director, Department of Health and Human Services, National Institutes of Health, National Institute for Biomedical Imaging and Bioengineering, Democracy Boulevard, Suite 200, Bethesda, MD 20892. Phone 301-496-9178; Fax 301-480-0679; E-mail grace.peng@nih.hhs.gov
Cyndi Rowland, Ph.D., Associate Professor, Center for Persons with Disabilities, 6801 Old Main Hill, Logan, UT 84322-6801. Phone 435-797-3381; Fax 435-797-3944; E-mail Cyndi@cpd2.usu.edu
Mark S. Sothmann, Ph.D., Dean, Indiana University School of Health & Rehabilitation Sciences, Coleman Hall 120, 1140 West Michigan Street, Indianapolis, IN 46202-5119. Phone 317-274-4702; Fax 317-274-4723; E-mail msothman@iupui.edu
Elizabeth T. Spiers, Director of Programs/Services, American Association of the Deaf-Blind, 8630 Fenton Street, Suite 121, Silver Spring, MD 20910. Phone 301-495-4403; Fax 301-495-4404; TTY 301-495-4402; E-mail espiers@aadb.org
Gwo-Wei Torng, Ph.D.,
Lead Transportation Engineer, Mitretek Systems, 600 Maryland Avenue SW, Suite 755, Washington, DC 20024. Phone 202-488-5714; Fax 202-863-2988; E-mail gwo-wei.torng@mitretek.org
Howard D. Wactlar, Ph.D., Alumni Research Professor of Computer Science and Vice Provost, Carnegie Mellon University, School of Computer Science, 5000 Forbes Avenue, Wean Hall 5216, Pittsburgh, PA 15213. Phone 412-268-2571; Fax 412-268-7458; E-mail wactlar@cmu.edu
Gale Whiteneck, Ph.D., Director of Research, Craig Hospital, 3425 South Clarkson Street, Englewood, CO 80113. Phone 303-789-8204; Fax 303-789-8441; E-mail gale@craig-hospital.org
Thomas Wlodkowski, Director of Accessibility, America Online, 22000 AOL Way, Dulles, VA 20165. Phone 703-265-1999; Fax 703-265-4786; E-mail tomwlodkowski@aol.com
Facilitators
Meeting Facilitator: Duncan Moore, Ph.D., Professor of Optical Engineering, University of Rochester, P.O. Box 270186, Rochester, NY 14627-0186. Phone 585-275-5248 or 585-275-8828; Fax 585-244-4936; E-mail moore@optics.rochester.edu, nancyg@seas.rochester.edu
Process Facilitator: Susan Daniels, Ph.D., Daniels & Associates, 3001 Veazey Terrace, NW #633, Washington, DC 20008-5413. Phone 202-363-8970; Fax 202-363-0145; E-mail SMDaniels@earthlink.net
NIDRR Staff
Steven J. Tingus, M.S., C.Phil., Director, NIDRR, U.S. Department of Education, Room 6056, 400 Maryland Avenue, SW, Washington, DC 20202-2700. Phone 202-245-7549; Fax 202-245-7643 or 202-245-7323; E-mail Steven.tingus@ed.gov
Kelly E. King, M.D., Deputy Director, NIDRR, U.S. Department of Education, Room 6059, 400 Maryland Avenue, SW, Washington, DC 20202-2700. Phone 202-245-7639; Fax 202-245-7643 or 202-245-7323; E-mail Kelly.king@ed.gov
Phillip Beatty, Ph.D., NIDRR, U.S. Department of Education, Room 6073, 400 Maryland Avenue, SW, Washington, DC 20202-2700. Phone 202-245-7267; Fax 202-245-7643 or 202-245-7323; E-mail Phillip.beatty@ed.gov
Ruth Brannon, M.S.P.H., M.A., Associate Director, Research Sciences Division, NIDRR, U.S.
Department of Education, Room 6054, 400 Maryland Avenue, SW, Washington, DC 20202-2700. Phone 202-245-7278; Fax 202-245-7643 or 202-245-7323; E-mail ruth.brannon@ed.gov
Carol G. Cohen, NIDRR, U.S. Department of Education, Room 6035, 400 Maryland Avenue, SW, Washington, DC 20202-2700. Phone 202-245-7303; Fax 202-245-7643 or 202-245-7323; E-mail Carol.cohen@ed.gov
Thomas Corfman, Ph.D., NIDRR, U.S. Department of Education, Room 6065, 400 Maryland Avenue, SW, Washington, DC 20202-2700. Phone 202-245-7306; Fax 202-245-7643 or 202-245-7323; E-mail Thomas.corfman@ed.gov
Richard P. Melia, Ph.D., Director, Research and Science Division, NIDRR, U.S. Department of Education, Room 6053, 400 Maryland Avenue, SW, Washington, DC 20202-2700. Phone 202-245-7446; Fax 202-245-7643 or 202-245-7323; E-mail Richard.melia@ed.gov
Shelley Reeves, M.S., NIDRR, U.S. Department of Education, Room 6031, 400 Maryland Avenue, SW, Washington, DC 20202-2700. Phone 202-245-7486; Fax 202-245-7643 or 202-245-7323; E-mail Shelley.reeves@ed.gov
William Schutz, Ph.D., NIDRR, U.S. Department of Education, Room 6063, 400 Maryland Avenue, SW, Washington, DC 20202-2700. Phone 202-245-7519; Fax 202-245-7643 or 202-245-7323; E-mail William.schutz@ed.gov
Arthur Sherwood, Ph.D., Science and Technology Advisor, NIDRR, U.S. Department of Education, Room 6058, 400 Maryland Avenue, SW, Washington, DC 20202-2700. Phone 202-245-7522; Fax 202-245-7643 or 202-245-7323; E-mail Arthur.sherwood@ed.gov
Office of Management and Budget (OMB) Staff
Lin Liu, Program Examiner, Office of Management and Budget, 725 17th Street NW, Washington, DC 20503. Phone 202-395-3308; E-mail Lin_liu@omb.eop.gov
U.S. Department of Education (DOE) Staff
Judith Anderson, Budget Service, U.S.
Department of Education, 400 Maryland Avenue, SW, Washington, DC 20202. Phone 202-401-3944; Fax 202-401-0220; E-mail Judith.anderson@ed.gov
New Editions Staff
Sheila Newman, President, New Editions, 6858 Old Dominion Drive, Suite 230, McLean, VA 22101. Phone 703-356-8035; Fax 703-356-8314; Email snewman@neweditions.net
Christine Mason, Ph.D., Senior Research Scientist, New Editions, 6858 Old Dominion Drive, Suite 230, McLean, VA 22101. Phone 703-356-8035; Fax 703-356-8314; Email cmason@neweditions.net
Barbara Rosen, Program Coordinator, New Editions, 6858 Old Dominion Drive, Suite 230, McLean, VA 22101. Phone 703-356-8035; Fax 703-356-8314; Email brosen@neweditions.net
Eldri Ferguson, Conference Manager, New Editions, 6858 Old Dominion Drive, Suite 230, McLean, VA 22101. Phone 703-356-8035; Fax 703-356-8314; Email eferguson@neweditions.net
APPENDIX C: AGENDA
Agenda
Annual Portfolio Assessment Expert Review Technology Portfolio Pilot
National Institute on Disability and Rehabilitation Research
October 5 and 6, 2005
The Sheraton National Hotel, 900 South Orme Street, Arlington, VA 22204
Wednesday, October 5, 2005
8:15 a.m. Continental Breakfast
9:00 a.m. Greetings and Introduction - Duncan Moore, Ph.D., Facilitator
9:15 a.m. NIDRR WELCOME - Steven James Tingus, M.S., C.Phil., Director of NIDRR
9:30 a.m. ORIENTATION: Overview of NIDRR Mission, Program Funding Mechanisms, & Logic Model - Richard Melia, Ph.D., Director of Research and Science Division
9:45 a.m. Overview of Technology Portfolio and Research Directions: LRPs 1999-2003 vs. 2005-2010 - Ruth Brannon, M.S.P.H., M.A., Associate Director, Research Sciences Division
10:00 a.m. The What & Why of APAER: Relationship to PART & NIDRR Logic Model - Margaret Campbell, Ph.D., Coordinator of Evaluation
10:15 a.m. APAER ROLES & PROCEDURES and Q&A - Margaret Campbell & Duncan Moore
10:45 a.m. Break
11:00 a.m.
WORKING GROUP - CLUSTERS
* Start process of reviewing awards & accomplishment nuggets and complete individual-level ratings - Cluster Facilitators & NIDRR Staff Liaisons
12:30 p.m. Lunch (on your own)
1:30 p.m. WORKING GROUP - Clusters, Cont.
* Finish reviewing awards & accomplishment nuggets and complete individual-level ratings
* Submit individual score sheets to contractor for aggregation by 3:30 p.m. - Cluster Facilitators & NIDRR Staff Liaisons
3:30 p.m. WORKING GROUP - Clusters, Cont.
* Summative discussion of cluster performance (i.e., relevance, quality, productivity, & significance of topics and accomplishments) - Cluster Facilitators & NIDRR Staff Liaisons
5:00 p.m. ADJOURN
Annual Portfolio Assessment Expert Review Technology Portfolio Pilot
Thursday, October 6, 2005
7:30 a.m. Continental Breakfast
8:00 a.m. WELCOME - Kelly E. King, M.D., Deputy Director
8:15 a.m. OVERVIEW OF DAY 2 - Duncan Moore, Ph.D., Facilitator
8:30 a.m. PORTFOLIO CLUSTER PRESENTATIONS - Cluster Presenters
10:30 a.m. PANEL DISCUSSION CORE QUESTIONS - Report Preparation "2 x 2" Exercise - Duncan Moore, Ph.D., and Susan Daniels, Ph.D., Co-Facilitators
12:15 p.m. Working Lunch & Q&A with NIDRR Senior Management - "Impressions" - Kelly E. King, M.D., Deputy Director; Arthur M. Sherwood, Ph.D., Science and Technology Advisor; Duncan Moore
1:30 p.m. REVIEW OF PORTFOLIO ASSESSMENT from Morning Session - Duncan Moore & Susan Daniels
2:00 p.m. Identification of Key Weaknesses in Portfolio and Recommendations for Improvement (NIDRR management & grantee performance) - Duncan Moore & Susan Daniels
3:30 p.m. WRAP-UP AND EVALUATION* - Duncan Moore, Facilitator, and New Editions Project Staff
4:00 p.m. ADJOURN
* Note: no NIDRR staff will be present for this session.
APPENDIX D: ACCOMPLISHMENT NUGGETS BY TECHNOLOGY CLUSTERS
Prior to the cluster-level meeting, each panelist had identified up to two "best" nuggets each for short-term outcomes, intermediate-level outcomes, and scientific accomplishments.
At the cluster-level meetings, the panelists decided which of these nuggets merited joint review. A complete list of accomplishment nuggets identified and reviewed by each cluster is presented below. It is important to note that the review of the individual grantee reports was intended to assist panelists in judging the overall quality, relevance, and performance results of NIDRR's entire Technology portfolio under the APAER process, rather than the performance of individual grantees.
Program Mechanism & Grant Title | Title of Accomplishment | Type of Accomplishment1 | Comments | Outputs | Outcomes
Sensory/Communication Cluster
1) Grantee A: RERC
1.1 Testing FM Microphones - Publication - Potential non-acceptance of a technology that is successful
1.2 Multidisciplinary Team - Very well balanced, good diversity
2) Grantee B: RERC
2.1 Language organization techniques for children - Publication - 1 person also ranked this as an output
2.2 Persons with ALS: patterns of acceptance of high-tech devices - Publication - NC
2.3 Multidisciplinary team - NA2 - Well balanced, high technical expertise
3) Grantee C: RERC
3.1 National Research Council book - Intermediate, Policy - Book is cited in Federal Register notice of rulemaking
3.2 Accessible Building Entry System - Intermediate, Device - Non-academic product. On market, but PR refers to Trace RERC and not this RERC
4) Grantee D: DRRP
4.1 Multidisciplinary Team - NA2 - Needs more technical expertise; needs more social science expertise for survey & focus group
4.2 E-book accessibility comparison table (X) - Tool/method, non-academic publication - Not published in a peer-reviewed platform; major organizations are involved, probably leading to systematic impact
5) Grantee E: FIP
5.1 MAGpie (X) - Publication/tool - Given the freeware nature of this tool, it is difficult to determine the exact impact on e-book accessibility
6) Grantee F: FIP
6.1 Half-toning for tactile imaging - Short-term knowledge - 0: lack of expertise of reviewer; 1: unable to open document
7) Grantee G: FIP 7.1
Math Genie - Short-term tool - Difficult to score; download with risk, purchase of article required; 2: potential significance, important topic
Mobility/Manipulation Cluster
1) Grantee H: RERC
1.1 Socket Fabrication System - Practice - Great knowledge translation; conferences in Mexico & Vietnam
1.2 Sound casting publication, Prosthetics & Orthotics - Publication
1.3 Alignment System - Publication
2) Grantee I: RERC
2.1 Review articles (BNE) - Publication - No new knowledge; consolidation of knowledge
2.2 3D VR technology - Publication - Has outstanding potential for a wide range of applications
2.3 Short course for physical therapists - Program - Not a solicited program (yet)
2.4 Cable-powered glove - Device - Has outstanding potential
3) Grantee J: RERC
3.1 Shoulder strengthening - Publication - Invited review article
3.2 Shoulder function during transfers - Practice - Guidelines published in clinical practice guidelines
4) Grantee K: RERC
4.1 Temporal symmetries in gait, SRRD (2005) - Publication - Creation of new mathematical tool for use in gait analysis
4.2 Ankle stiffness during walking - Publication - 20 in study; less ambitious, less likely to lead to a comprehensive ankle prosthesis
5) Grantee L: FIP
5.1 Publication in Experimental Brain Research (2004) - Publication - Is used by the investigator to design new therapies being tested in this FIP
6) Grantee M: FIP
6.1 Residual limb circulation - Publication - Translation of knowledge is unclear; need more information
IT/Telecommunications Access Cluster
1) Grantee N: RERC
1.1 Gesture panel - Device - Prototype transferred to industry
1.2 Gesture toolkit - Tool - Don't know how widely used
1.3 Activity monitoring/weight shift monitoring - Device - Patent application
2) Grantee O: RERC
2.1 Comments to FCC on IP-Enabled Services - Policy - Document to FCC, FCC comments, 5; position reflects outcome
2.2 Access to VoIP - Publication - Don't know impact
3) Grantee P: RERC
3.1 Book chapter - Publication - Price of encyclopedia high; circulation unknown; NIDRR not acknowledged
3.2 EI Access Kiosk - Device
Placement in postal systems
4) Grantee Q: DRRP
4.1 Learning & inferring transportation - Publication - 2005, no NIDRR acknowledgment; can't determine; dismissed because of 2005 date
4.2 Mobile context inference - Publication - 2005, no NIDRR acknowledgment; can't determine; dismissed because of 2005 date
5) Grantee R: FIP
5.1 DTVCC test materials - Product - Standard closed captioning reference for industry
5.2 Implementation guidelines - Method - Important for industry SMPTE standard
6) Grantee S: FIP
6.1 Commercialization of Wizard - Tool - 1,000 licenses sold. Output = wizard = product
6.2 Proceedings - Publication - No indication of quality
7) Grantee T: FIP
7.1 Published draft specification matrix - Publication - In working groups of SMPTE for standards
7.2 NJ mandated theaters to have more accessible technology; MOPIX OK'ed - Policy
7.3 Product: glasses for captioning - Device - Outcome
Environmental Access Cluster
1) Grantee U: RERC
1.1 Force plate design transfer - Publication - Need more explanation. Appeared to be just an application of standard procedures without innovation
1.2 Force plate - Product - May have great future implication
1.3 Docket prepared for lavatory design - Change in practice - "Alleged outcome," lack of supporting evidence
2) Grantee V: RERC
2.1 Published national survey results of difficulties with medical equipment - Publication
2.2 MU-Lab - Measurement - Improves measurement & strengthens design and evaluation
2.3 External use of survey results for facility site review audits in California - Policy and practice - Improved policy and practice
3) Grantee W: RERC
3.1 Workplace accommodations literature review - Publication
3.2 Presentation at the Technology and Persons with Disabilities Conference - Presentation - May also improve policy, but lack of evidence for follow-up input and feedback
3.3 National presentations to diverse audiences - Presentation - Unable to measure direct impacts due to lack of information
4) Grantee X: RERC
4.1 Wheelchair occupancy restraint systems issues - Publication - 2005
publication, "alleged" to provide guidance to industry, no evidence, need to acknowledge NIDRR 4.2 Published method to improve wheelchair restraint usability Publication 4.3 Power wheelchair code Standards Lack of supporting information/evidence 4.4 Ride-safe, step by step brochure/web site Guidelines Need more knowledge transfer efforts 5) Grantee Y: RERC 5.1 A book on Smart Technology for Aging, Disability, and Independence Book Published by John Wiley & Sons, Inc. 5.2 LAMP (Low ADL monitoring program) Improved measurements Based on randomized clinical trials 5.3 Assistive technology service delivery policy Improve policy 5.4 Six "smart home" related patents Device 6) Grantee Z: RERC 6.1 A journal special issue Publication Some technical reports are important, although dated 2005 6.2 Visitability initiative Initiative Would like to obtain supporting evidence for outcomes 6.3 Prototype database Improved policy Adopted by the AccessBoard with impacts on Accessibility Codes 6.4 Universal Design New York Change in Practice Model has been adopted elsewhere Cross-Cutting Issues Cluster 1) Grantee AA: RERC 1.1 Commercialized products available to consumers Intermediate outcome, device system capacity Increases supply of available AT products to consumers. No market data to evaluate overall impact in the marker. Nugget represents collection of devices on market, rather than each product representing a nugget (lids off, point smart mouse, Kelvin Thermostat, Earmolds) 1.2 Industry profile on visual impairments Publication Evidence of document and comprehensive coverage of topic 1.3 Information Technology for Independence: Community-based Research 1.4 Web accessibility of Health Care web sites Publication 1.5 WAB Measurement Tool Product Product is a measurement tool for web site accessibility. WAB is a tool that has been cited by others. Some degree of interest in future use. 
Difficult to determine if this is an outcome or output 2) Grantee BB: DRRP 2.1Special Issue Journal Publication Unclear from articles if they have had any significant reach. How wide-pread they have been read is also unclear 3) Grantee CC: DRRP 3.1 AT outcomes measures web site Product Only provided introductory information of each measure 3.2 AT effectiveness measure Tool Inadequate information for any further assessment 3.3 AT in the Community Service 4) Grantee DD: DRRP 4.1 Increased knowledge of AT by consumers Service No data to back up how this intervention for increasing AT is supported. 4.2 AT survey of consumers Findings Research appears to be in progress. Project has participants but no clear findings to date. Is this an assessment of a survey for research? It is difficult to understand which activities are NIDRR related and which are program specific funded through other sources. 5) Grantee EE: FIP 5.1 Develop medicare/DME tracking system Service Developed tracking system that allowed movement to implementation 5.2 Redistribution of DME Program 611 pieces of equipment recovered and 393 re-assigned 5.3 Establish model for DME redistribution Capacity-building Signed contact with state. Other states interested in replication. 6) Grantee FF: FIP 6.1 System requirements for future Web-based AT outcomes database Process not completed, therefore not enough information to date 1Note that teams varied in their approach to rating publications, devices, and tools and in general, did not rate an accomplishment nugget as an outcome unless there was evidence of its impact. 2Multidisciplinary teams were identified in instructions on the scoring system provided as another type of nugget; they are not rated as either an outcome or output. 
APPENDIX E: PARTICIPANT EVALUATION SUMMARY

Participant Evaluation Summary
Annual Portfolio Assessment Expert Review (APAER) 2005 Technology Portfolio Pilot
October 5-6, 2005

Evaluation Results 3

Design. Expert reviewers were instructed to consider the potential of the APAER process to do the following, using a scale of 1-3, with 3 being "very well designed."

Item | Reviewer responses (1-13) | Average
Assess the quality and relevance of NIDRR-funded research | 2, 2, 2, 1, 1, 2, 3, 1, 2, 1, 2, 2, 1 | 1.69
Identify the extent to which outputs and outcomes are contributing to the agency's long-term performance measures and strategic goals | 2, 2, 2, 2, 2, 2, 3, 1, 2, 2, 2, 2, 1 | 1.92
Provide NIDRR with an assessment of the agency's management of research directions and award decisions | 2, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2.5, 2 | 1.73
Overall average | 1.78

General. Rated on a scale of 1-5, with 5 being "strongly agree."

Item | Reviewer responses (1-13) | Average
The concept of NIDRR's portfolio assessment is sound | 2, 2, 4, 4, 5, 5, 4, 4, 5, 5, 1, 5, 2 | 3.69
The reviewers were well qualified and presented a good balance of expertise | 5, 4, 5, 4, 3, 5, 4, 5, 5, 5, 5, 5, 5 | 4.62
The background materials were useful | 2, 3, 2, 1, 4, 2, 1, 5, 5, 3, 2, 4, 2 | 2.77
The on-site materials were helpful | 3, 3, 3, 2, 2, 2, 3, 2, 3, 3, 2, 2, 2 | 2.46
The Web site was useful | 3, 2, 1, 1, n/a, 3, 1, 5, 3, n/a, 1, 2, 3 | 2.27
Day 1: overview by NIDRR staff was helpful | 4, 4, 4, 4, 5, 4, 4, 4, 5, 4, 4, 5, 3 | 4.15
Day 1: discussion in cluster groups was effective | 4, 5, 4, 5, 4, 5, 4, 4, 5, 5, 5, 4, 4 | 4.46
Day 2: whole-group APAER discussion and scoring was effective | 4, 5, 4, 4, 4, 2, 4, 2, 3, 5, 5, 2, 4 | 3.69
The discussion of recommendations to NIDRR was important and useful | 4, 5, 4, 2, 5, 5, 3, 4, 5, 5, 4, 4, 4 | 4.15
The meeting facilitators were effective | 3, 5, 4, 3, 5, 5, 4, 4, 3, 5, 5, 4, 4 | 4.15
The support provided by staff on site was helpful | 3, 5, 4, 4, 5, 5, 3, 5, 5, 4, 5, 4, 4 | 4.31
The hotel and meeting space, including accessibility, met your expectations | 3, 4, 4, 4, 5, 5, 3, 5, 5, 4, 5, 2, 4 | 4.08
The food and beverage services were excellent | 3, 3, 4, 4, 5, 5, 3, 4, 5, 5, 5, 4, 3 | 4.08
Overall average | 3.76

Written Comments

What are the strengths of the APAER process?
* Gives NIDRR an excellent point of reference for evaluating project and program effectiveness.
* Gathering nuggets, gaining an overview; gathering expert opinion/reaction.
* Diversity of input into the process; NIDRR staff involved and supportive.
* The concept model is elegant.
* Helps identify strengths and weaknesses of NIDRR's process and work.
* I like the diversity of the people on the panel: engineers, social scientists, consumers, etc.
* The potential for what can be done once changes to the pilot are addressed.
* Excellent opportunity for experts in the field to take a good look at the quality of reviews being contributed.
* Brought together a very well-balanced group of experts that covered the span of NIDRR disciplines.
* The concept is great, and the result would be great, should "field data" be collected consistently and accurately.
* External, unbiased viewpoint; getting more people aware of NIDRR.
* Excellent beginning; consider this a learning experience with some baseline data and information about what you need to do next in this process.
* Focus on outcomes.

What are the weaknesses of the APAER process?
* The methodology appears flawed; there is a heavy reliance on opinion rather than a systematic approach that would make APAER as objective as possible.
* Not enough information from project reports. No data on priorities at the meeting.
* Very unclear, confusing process. Need an initial orientation via phone before gathering people together. More information is needed and should be provided, rather than relying on reviewers to get it on their own.
* The operational steps in applying the intended process are extremely poorly thought out.
* Lack of sufficient information on individual grants.
* This is a raw process, and the kinks need to be worked out. I just hope that the comments for improvement are taken to heart.
* Participants did not have enough knowledge of the NIDRR portfolio and history before coming to Washington.
* The collection forms were inconsistently filled out by the grantees, and the panel based all of its comments and recommendations on them, which may or may not be relevant and useful for NIDRR.
* External viewpoints do not encompass the entire picture; this should be an internal review within NIDRR.
* Just getting verbal feedback on all clusters wasn't enough to be able to answer the portfolio questions.
* Lack of effective communication of outcomes.

Other Comments:
* Keep it up; this will make an important difference.
* Unclear expectations of the reviewers.
* Make things less bureaucratic, less paper...
* I hope you will give us feedback from both Technology and Technology reviews; did they have the same/similar comments?
* The time frame was very compressed, with no real chance to do a thorough review. Also, don't expect us to download hundreds of pages; shorten the amount of background information.
* Simplify.

1 PART is a systematic method of assessing and improving program performance across the Federal government, instituted by the Office of Management and Budget (OMB). The APAER was also designed to satisfy specific recommendations from NIDRR's 2003 PART review.
2 Note: no RRTCs were in the Technology portfolio.
3 Total number of responses = 13.