Deborah Douma

Dean of Grants and Federal Programs, Pensacola State College

Dr. Deborah Douma, Dean of Grants and Federal Programs at Pensacola State College in Florida, has an AA from Irvine Valley College, a BA in communication arts and MS in administration from the University of West Florida, and an EdD in higher education administration from the University of Florida. Dr. Douma’s research focused on factors leading to engagement of community college faculty in grant-writing activities. She serves on the Florida Association of Colleges Foundation board and locally on boards of the EscaRosa Coalition on the Homeless, the Escambia County 4-H Foundation, and the First City Art Alliance.


Blog: Utilizing Your Institutional Research Office Resources When Writing a Grant Application

Posted on March 20, 2018 in Blog
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Deborah Douma
Dean, Grants and Federal Programs, Pensacola State College
Michael Johnston
Director of Institutional Research, Pensacola State College

There are a number of guiding questions that must be answered to develop a successful grant project evaluation plan. The answers to these questions also provide guidance to demonstrate need and develop ambitious, yet attainable, objectives. Data does not exist in a vacuum and can be evaluated and transformed into insight only if it is contextualized with associated activities. This is best accomplished in collaboration with the Institutional Research (IR) office. The Association for Institutional Research’s aspirational statement “highlights the need for IR to serve a broader range of decision makers.”

We emphasize the critical need to incorporate fundamental knowledge of experimental and quasi-experimental design at the beginning of any grant project. In essence, grant projects are experiments; they just aren’t necessarily performed in a laboratory. Any experiment is designed to introduce new conditions: here, the independent variable is the grant project and the dependent variable is the success of the target population (students, faculty). The ability to properly measure and replicate this scientific process must be established during project planning, and the IR office can be instrumental in the design of your evaluation.
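
To make this concrete, here is a minimal sketch of the comparison such a design implies: testing whether a grant-served cohort is retained at a higher rate than a comparison group. All counts are hypothetical, and the two-proportion z-test merely stands in for whatever method your IR office recommends.

```python
# Hypothetical quasi-experimental comparison: grant-served cohort vs. a
# matched comparison group. All counts are invented for illustration.
from statsmodels.stats.proportion import proportions_ztest

retained = [312, 268]  # students retained: [treatment, comparison]
enrolled = [400, 400]  # cohort sizes

# One-sided test: did the grant-served group retain at a higher rate?
z_stat, p_value = proportions_ztest(retained, enrolled, alternative="larger")

print(f"Treatment retention:  {retained[0] / enrolled[0]:.1%}")
print(f"Comparison retention: {retained[1] / enrolled[1]:.1%}")
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
```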

Responding to a program solicitation (or RFP, RFA, etc.) provides the opportunity to establish the need for the project, define measurable outcomes, and lay out an appropriate evaluation plan that can win over the hearts and minds of reviewers and lead to a successful grant award. Institutional researchers work with the grants office not only to measure outcomes but also to identify potential opportunities for improvement. IR staff act as data scientists and statisticians while working on grants, becoming intimately acquainted with the data, the collection process, the relationships between variables, and the science being investigated. Although the terms statistician and data scientist are often used synonymously, data scientists do more than test hypotheses and develop forecasting models; they also identify how variables not being studied may affect outcomes. This allows IR staff to see beyond the questions being asked, contributing to the development of results while also identifying unexpected structures in the data. Finding such structures may lead to further investigation in other areas and to opportunities for other grants.
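
As one illustration of that exploratory work, the hypothetical sketch below groups a retention outcome by a variable outside the original hypothesis (enrollment status). The column names and values are assumptions for illustration, not a real institutional schema.

```python
# Hypothetical exploratory check: does a variable outside the hypothesis
# (enrollment status) interact with the outcome? All values are invented.
import pandas as pd

students = pd.DataFrame({
    "retained":   [1, 0, 1, 1, 0, 1, 0, 1],  # 1 = retained
    "served":     [1, 0, 1, 0, 0, 1, 1, 0],  # 1 = received grant services
    "enrollment": ["FT", "PT", "FT", "FT", "PT", "FT", "PT", "FT"],
})

# Retention by service group and enrollment status can reveal structure the
# original question never asked about (e.g., the intervention may help
# part-time students most).
print(students.groupby(["served", "enrollment"])["retained"].mean())
```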

If a project’s objective is to effect positive change in student retention, it is necessary to know the starting point before any grant-funded interventions are introduced. IR can provide descriptive statistics on the student body and the target population before the intervention. This historical data is used not only for trend analysis but also for validation, that is, correcting errors in the data. Validation can be as simple as looking for differences between comparison groups and confirming that potential differences are not due to error. IR can also assist with the predictive analytics necessary to establish appropriate benchmarks for measurable objectives. For example, predicting that an intervention will increase retention rates by 10-20% when a 1-2% increase would be more realistic could lead to a proposal being rejected or set the project up for failure. Your IR office can also help ensure that appropriate quantitative statistical methods are used to analyze the data.
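
Here is a minimal sketch of that benchmarking step, using invented historical retention rates: fit the trend, project the no-intervention baseline, and set an objective a realistic 1-2 points above it rather than an implausible 10-20.

```python
# Hypothetical benchmark-setting from historical data. Rates are invented;
# your IR office would supply the institution's actual figures.
import numpy as np

years     = np.array([2013, 2014, 2015, 2016, 2017])
retention = np.array([0.612, 0.618, 0.605, 0.621, 0.627])  # fall-to-fall rates

# A linear trend projects the "no intervention" baseline for the next year
slope, intercept = np.polyfit(years, retention, 1)
baseline_2018 = slope * 2018 + intercept

# A defensible objective adds a modest lift to the trend, not 10-20 points
print(f"Projected baseline (no intervention): {baseline_2018:.1%}")
print(f"Proposed objective (+1.5 points):     {baseline_2018 + 0.015:.1%}")
```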

Tip: Involve your IR office from the beginning, during project planning. Their involvement contributes greatly to a competitive application and to an evaluation plan that provides the guidance necessary for a successful project.

Blog: Evaluation Plan Development for Grant Writing

Posted on March 25, 2015 in Blog

Deborah Douma
Dean of Grants and Federal Programs, Pensacola State College

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

As dean of institutional effectiveness and grants, I have varied responsibilities, but at heart I am a grant writer. I find it easy to write a needs statement based on available data; more challenging is developing an effective evaluation plan for a proposed project.

A lot of time and effort – taxpayer supported – go into project evaluation, an increasingly significant component of federal grant applications, as illustrated by the following examples:

  • My college partners on two existing U.S. Department of Labor Trade Adjustment Assistance Community College and Career Training (TAACCCT) Grants – almost $2 billion nationally to expand training for the unemployed – which allow up to 10 percent of project budgets to be designated for mandatory external evaluation.
  • We have an $8.5 million U.S. Department of Health & Human Services Health Profession Opportunity Grant demonstration project. Part of that “demonstration” included mandatory participation in activities conducted by contracted external evaluators.

We recently submitted grant applications under the highly competitive U.S. Department of Education Student Support Services (SSS) Program. My college has a long-standing SSS program that meets all of its objectives, so we’ll receive “extra” prior experience points. We are assured refunding, right? Maybe, as long as we address competitive preference priorities and score better than perfect – every point counts.

Although external evaluation is not required, a comparison of language excerpted from the last three SSS competitions makes clear that there is now a much greater emphasis on the details of the evaluation plan. The guidelines require a detailed description of what types of data will be collected and how the applicant will use that information in evaluating project activities. It is no longer sufficient to say that “project staff will collect quantitative and qualitative data and use this information for project improvement.”

Our successful evaluation plans start with a detailed logic model, which allows us to make realistic projections of what we hope will happen and to plan data collection around the project’s key activities and outcomes. We use these guiding questions to help formulate the details (a sketch of how the answers might come together follows the list):

  • What services will be provided?
  • What can be measured?
    • perceptions, participation, academic progress
  • What information sources will be available?
  • What types of data will be collected?
    • student records, surveys, interviews, activity-specific data
  • How will we review and analyze the data collected?
  • What will we do with the findings?
    • specific actions
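
As a rough illustration of how the answers come together, the sketch below records one plan entry per project activity. The activities, sources, and analyses are invented placeholders, not our actual SSS plan.

```python
# Hypothetical mapping from the guiding questions to a per-activity data
# collection plan. All entries are illustrative placeholders.
evaluation_plan = {
    "tutoring": {
        "measures": ["participation", "academic progress"],
        "sources":  ["activity sign-in logs", "student records"],
        "analysis": "compare term GPA of participants vs. non-participants",
        "use":      "adjust tutoring schedule and staffing each term",
    },
    "advising": {
        "measures": ["perceptions", "persistence"],
        "sources":  ["student surveys", "enrollment records"],
        "analysis": "track fall-to-fall retention of advised students",
        "use":      "refine the advising protocol from survey feedback",
    },
}

for activity, plan in evaluation_plan.items():
    print(f"{activity}: measure {', '.join(plan['measures'])} "
          f"using {', '.join(plan['sources'])}")
```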

Unlike universities, most community and state colleges are not hotbeds of research and evaluation. So what can grant writers do to prepare themselves to meet the “evaluation plan” challenge?

  • Make friends with a statistician; they tend to hang out in the Mathematics or Institutional Research departments.
  • Take a graduate-level course in educational statistics. If you’re writing about something, it helps to have at least a rudimentary knowledge of the subject.
  • Find good resources. I have several textbook-like evaluation manuals, but my go-to, dog-eared guide for developing an evaluation plan is the National Science Foundation’s “2010 User-Friendly Handbook for Project Evaluation” (logic model information is in Chapter 3).
  • Browse the open-access list of Institutional Research (IR) links on the website of the Association for Institutional Research (AIR; a membership organization), which provides more than 2,200 links to external IR Web pages on a variety of topics related to data and decisions for higher education.
  • Use Community College Research Center (CCRC) resources, such as publications on prior research, to guide evaluation plan development (http://ccrc.tc.columbia.edu/). The CCRC FAQs Web page provides national data useful for benchmarking your grant program’s projected outcomes.