Archive: intellectual merit

Newsletter: Revisiting Intellectual Merit and Broader Impact

Posted on January 1, 2016 in Newsletter

Director of Research, The Evaluation Center at Western Michigan University

If you have ever written a proposal to the National Science Foundation (NSF) or participated in a proposal review panel for NSF, you probably instantly recognize the terms Intellectual Merit and Broader Impacts as NSF’s merit review criteria. Proposals are rated and funding decisions are made based on how well they address these criteria. Therefore, proposers must describe the potential of their proposed work to advance knowledge and understanding (Intellectual Merit) and benefit society (Broader Impacts).

Like cramming for an exam and then forgetting 90 percent of what you memorized, it’s all too easy for principal investigators to lose sight of Intellectual Merit and Broader Impacts after proposal submission. But there are two important reasons to maintain focus on Intellectual Merit and Broader Impacts after an award is made and throughout project implementation.

First, the goals and activities expressed in a proposal are commitments about how a particular project will advance knowledge (Intellectual Merit) and bring tangible benefits to individuals, institutions, communities, and/or our nation (Broader Impacts). Simply put, PIs have an ethical obligation to follow through on these commitments to the best of their abilities.

Second, when funded PIs seek subsequent grants from NSF, they must describe the results of their prior NSF funding in terms of Intellectual Merit and Broader Impacts. In other words, proposers must explain how they used their NSF funding to actually advance knowledge and understanding and benefit society. PIs who have evidence of their accomplishments in these areas and can convey it succinctly will be well positioned to seek additional funding. To ensure that evidence of both Intellectual Merit and Broader Impacts is being captured, PIs should revisit their project evaluation plans with their evaluators, crosschecking the proposal's claims about potential Intellectual Merit and Broader Impacts against the evaluation questions and data collection plan to make sure compelling evidence is gathered.

Last October, I conducted a workshop on this topic at the ATE Principal Investigators Conference with colleague Kirk Knestis, an evaluator from Hezel Associates. Dr. Celeste Carter, ATE program co-lead, spoke about how to frame results of prior NSF support in proposals. She noted a common misstep: proposers address results from prior support by simply reiterating what they said they were going to do in their funded proposals, rather than describing the actual outcomes of the grant. Project summaries (the one-page descriptions of a proposed project's Intellectual Merit and Broader Impacts required in all NSF proposals) are necessarily written in a prospective, future-oriented manner because the work has not yet begun. In contrast, the Results of Prior NSF Support section describes completed work, so it is written in past tense and should include evidence of accomplishments. Describing achievements and presenting evidence of their quality and impact shows reviewers that the proposer is a responsible steward of federal funds, can deliver on promises, and is building on prior success.

Take time now, well before it is time to submit a new proposal or a Project Outcomes Report, to make sure you haven’t lost sight of the Intellectual Merit and Broader Impacts aspects of your grant and how you promised to contribute to these national priorities.

Newsletter: How can PIs demonstrate that their projects have “advanced knowledge”?

Posted on January 1, 2016 in Newsletter

Director of Research, The Evaluation Center at Western Michigan University

NSF’s Intellectual Merit criterion is about advancing knowledge and understanding within a given field or across fields. Publication in peer-reviewed journals provides strong evidence of the Intellectual Merit of completed work. It is an indication that the information generated by a project is important and novel. The peer review process ensures that articles meet a journal’s standard of quality, as determined by a panel of reviewers who are subject matter experts.

In addition, publishing in an academic journal is the best way of ensuring that the new knowledge you have generated is available to others, becomes part of a shared scientific knowledge base, and is sustained over time. Websites and digital libraries tend to come and go with staff and funding changes. Journals are archived by libraries worldwide and, importantly, indexed to enable searches using standard search terms and logic. Even if a journal is discontinued, its articles remain available through libraries. Conference presentations are important dissemination vehicles, but don’t have the staying power of publishing. Some conferences publish presented papers in conference proceedings documents, which helps with long-term accessibility of information presented at these events.

The peer review process that journals employ to determine if they should publish a given manuscript is essentially an evaluative process. A small group of reviewers assesses the manuscript against criteria established for the journal. If the manuscript is accepted for publication, it met the specified quality threshold. Therefore, it is not necessary for the quality of published articles produced by ATE projects to be separately evaluated as part of the project’s external evaluation. However, it may be worthwhile to investigate the influence of published works, such as through citation analysis (i.e., determination of the impact of a published article based on the number of times it has been cited—to learn more, see http://bit.ly/cit-an).

Journals focused on two-year colleges and technical education are good outlets for ATE-related publications. Examples include Community College Enterprise, Community College Research Journal, Community College Review, Journal of Applied Research in the Community College, New Directions for Community Colleges, Career and Technical Education Research, Journal of Career and Technical Education, and Journal of Education and Work. (For more options, see the list of journals maintained by the Center of Education and Work (CEW) at the University of Wisconsin at http://bit.ly/cew-journals.)

NSF’s Intellectual Merit criterion is about contributing to collective knowledge. For example, if a project develops embedded math modules for inclusion in an electrical engineering e-book, students may improve their understanding of math concepts and how they relate to a technical task—and that is certainly important given the goals of the ATE program. However, if the project does not share what was learned about developing, implementing, and evaluating such modules and present evidence of their effectiveness so that others may learn from and build on those advances, the project hasn’t advanced disciplinary knowledge and understanding.

If you are interested in preparing a journal manuscript to disseminate knowledge generated by your project, first look at the types of articles being published in your field (check out CEW’s list of journals referenced above). You will get an idea of what is involved and how the articles are typically structured. Publishing can become an important part of a PI’s professional development, as well as a project’s overall effort to disseminate results and advance knowledge.

Blog: Intellectual Merit and Broader Impacts: Identifying Your Project’s Achievements and Supporting Evidence

Posted on October 21, 2015 in Blog

Director of Research, The Evaluation Center at Western Michigan University

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Updated April 2019 for Throwback Thursday series.

The deadline for proposals to the National Science Foundation’s Advanced Technological Education program for this year just passed, so a blog about what to include in the Results from Prior NSF Support section in an ATE proposal may seem untimely. But if you’re involved in a project that will seek additional funding from NSF in the next year or two, there’s no better time than NOW to assess the quality and quantity of the evidence of your current project’s intellectual merit and broader impacts (NSF’s review criteria for proposed and completed work).

Understand the fundamentals of intellectual merit and broader impacts: In a nutshell, intellectual merit is about advancing knowledge and understanding. Broader impacts are benefits to society. If your project is mainly research, it’s likely that most of your achievements are related to intellectual merit. If you’re mostly doing development and implementation, your achievements are probably more in the area of broader impacts. But you should have evidence of both aspects of your work.

Identify your project’s intellectual merit and broader impacts: To home in on your project’s intellectual merit and broader impacts, it helps to break these big ideas down into smaller chunks. To identify your project’s intellectual merit, ask yourself: What are we doing that generates new knowledge or improved understanding? Are we using novel research methods or investigating a novel topic to better understand an aspect of STEM education? Is our project transformative, bringing about extraordinary or revolutionary change? In terms of broader impacts, ask: What are we doing to serve groups that have been historically underrepresented in STEM, develop a diverse workforce, create partnerships between academia and industry, enhance education infrastructure, increase economic competitiveness, or improve STEM education in general?

Identify gaps in evidence: It’s not enough to profess your achievements—you need evidence. Evidence is not the method you used to collect data (tests, surveys, observations, etc.); it is what those data show (a genetic test is not evidence that someone committed a crime; the result of that test is the evidence). If you don’t have good evidence of important achievements, revise your evaluation plan and start collecting data as soon as possible. Make sure that you have evidence of more than just the completion of activities. For example, if your achievement is that you developed a new certification program, then to demonstrate broader impacts you need evidence that it is a high-quality program and that students are enrolling, graduating, and getting jobs (or at least evidence as far down the outcomes chain as is reasonable). Plotting your evidence on a logic model is a good way to figure out whether you have sufficient evidence of outcomes as well as activities and outputs.

If you find gaps that will impair your ability to make a compelling case about what you’ve achieved with your current grant, update your evaluation plan accordingly. When you write your next proposal, you will be required to present evidence of your achievements under the specific headings of “Intellectual Merit” and “Broader Impacts”; if you don’t, your proposal is at risk of being returned without review.


To learn more, check out these resources:

NSF Grant Proposal Guide (this link goes directly to the section on Results from Prior NSF Support): http://bit.ly/nsf-results

NSF Merit Review Website: http://www.nsf.gov/bfa/dias/policy/merit_review/

NSF Important Notice #130: Transformative Research (for details about what NSF considers transformative, one dimension of intellectual merit): http://www.nsf.gov/pubs/2007/in130/in130.jsp

NSF Examples of Broader Impacts: http://www.nsf.gov/pubs/2002/nsf022/bicexamples.pdf

Perspectives on Broader Impacts: http://www.nsf.gov/od/oia/publications/Broader_Impacts.pdf

National Alliance for Broader Impacts: http://broaderimpacts.net/about/

Materials from our ATE PI conference workshop on this topic, including presentation slides, worksheets, and EvaluATE’s Results from Prior NSF Support Checklist: http://www.evalu-ate.org/library/conference/pi-conference/