
Blog: Evaluation’s Role in Retention and Cultural Diversity in STEM

Posted on October 28, 2015 in Blog

Research Associate, Hezel Associates

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Recently, I attended the Building Pathways and Partnerships in STEM for a Global Network conference, hosted by the State University of New York (SUNY) system. It focused on innovative practices in STEM higher education, centered on increasing retention, completion, and cultural diversity.

As an evaluator, it was enlightening to hear about new practices being used by higher education faculty and staff to encourage students, particularly students in groups traditionally underrepresented in STEM, to stay enrolled and get their degrees. These included:

  • Research opportunities! Students should be exposed to real research if they are going to engage in STEM. This is not only important for four-year degree students, but also community college students, whether they plan to continue their education or move into the workforce.
  • Internships (PAID!) are crucial for gaining practical experience before entering the workforce.
  • Partnerships, partnerships, partnerships. Internships and research opportunities are most useful if they are with organizations outside of the school. This means considerable outreach and relationship-building.
  • One-on-one peer mentoring. Systems in which upper-level students work directly with new students to help them get through tough classes or labs have been shown to keep students enrolled not only in STEM programs, but in college in general.

The main takeaway from this conference is that the SUNY system is being more creative in engaging students in STEM. They are making a concerted effort to help underrepresented students. This trend is not limited to NY—many colleges and universities are focusing on these issues.

What does all this mean for evaluation? Evidence is more important than ever to sort out what types of new practices work and for whom. Evaluation designs and methods need to be just as innovative as the programs they are reviewing. As evaluators, we need to channel program designers’ creativity and apply our knowledge in useful ways. Examples include:

  • Being flexible. Many methods are brand new or new to the institution or department, so implementers may tweak them along the way. This means we need to pay attention to how we assess outcomes, perhaps taking guidance from Patton’s Developmental Evaluation work.
  • Considering cultural viewpoints. We should always be mindful of the diversity of perspectives and backgrounds when developing instruments and data collection methods. This is especially important when programs are meant to improve underrepresented groups’ outcomes. Think about how individuals will be able to access an instrument (online, paper) and pay attention to language when writing questionnaire items. The American Evaluation Association provides useful resources for this: http://aea365.org/blog/faheemah-mustafaa-on-pursuing-racial-equity-in-evaluation-practice/
  • Thinking beyond immediate outcomes. What do students accomplish in the long term? Do they go on to earn higher degrees? Do they get jobs that fit their expectations? If you can’t measure these due to budget or timeline constraints, help institutions design ways to do this themselves. It can help them continue to identify program strengths and weaknesses.

Keep these in mind, and your evaluation can provide valuable information for programs geared to make a real difference.

Blog: Changing Focus Mid-Project

Posted on September 30, 2015 in Blog

Physics Instructor, Spokane Falls Community College

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Along with co-PIs Michelle Moore and Max Josquin, I am a recent recipient of an NSF ATE grant aimed at increasing female enrollment and retention in my college’s Information Systems (IS) program. Our year one activities included creating a daylong Information Technology (IT) camp for incoming eighth and ninth grade young women.


In our original plan, we had set aside money for five IS college students to help us for eight hours during the summer camp. We decided to meet weekly with the students during the months leading up to our event to stay on task and schedule.

1st surprise: Nine students showed up to the initial meeting, and eight of those remained with us for the project’s duration.

2nd surprise: Instead of waiting for our guidance, the students went off and did their own research and then presented a day-long curriculum that would teach hardware, software, and networking by installing and configuring the popular game Minecraft on Raspberry Pi microcomputers.
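
To give a sense of the kind of exercise the students designed, here is a minimal sketch using the Python API (the mcpi library) bundled with Minecraft Pi Edition on Raspbian. It is illustrative only, not the students' actual curriculum; the chat message, block choice, and coordinates are arbitrary.

  # Minimal sketch of a first Minecraft-on-Raspberry-Pi exercise
  # (illustrative only, not the camp's actual lesson plan).
  from mcpi.minecraft import Minecraft
  from mcpi import block

  mc = Minecraft.create()  # connect to the game running on the Pi

  # Post a message to the in-game chat window
  mc.postToChat("Welcome to IT camp!")

  # Find the player's position and lay a small stone platform beneath it
  pos = mc.player.getTilePos()
  for dx in range(-2, 3):
      for dz in range(-2, 3):
          mc.setBlock(pos.x + dx, pos.y - 1, pos.z + dz, block.STONE.id)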


3rd surprise: When asked to think about marketing, the students showed us a logo and a flyer that they had already designed. They wanted T-shirts with the new logo for each of the campers. And they wanted each camper to be able to take home their Raspberry Pi.


At this point, it was very clear to my colleagues and me that we should take a step back and let the students run the show. We helped them create lesson plans to achieve the outcomes they wanted, but they took ownership of everything else. We still handled registration and advertising, but on the day of the camp, the students were the ones in the classroom teaching the campers. My colleagues and I were the gofers who collected permission slips, got snacks ready, and picked up pizza for lunch.

Perhaps our biggest surprise came when our external evaluator, Terryll Bailey, showed us the IS college student survey results:

“96.8% of the volunteers indicated that participating as a Student Instructor increased their confidence in teamwork and leadership in the following areas:

  • Taking a leadership role.
  • Drive a project to completion.
  • Express your point of view taking into account the complexities of a situation.
  • Synthesize others’ points of view with your ideas.
  • Ability to come up with creative ideas that take into account the complexities of the situation.
  • Help a team move forward by articulating the merits of alternative ideas or proposals.
  • Engage team members in ways that acknowledge their contributions by building on or synthesizing the contributions of others.
  • Provide assistance or encouragement to team members.

All eight (100%) indicated that their confidence increased in providing assistance or encouragement to team members.”

For year two of our grant, we’re moving resources around in order to pay more students for more hours. We are partnering with community centers and middle schools to use our IS college students as mentors. We hope to formalize this such that our students can receive internship credits, which are required for their degree.

Our lessons learned during this first year of the grant include being open to change and being willing to relinquish control. We are also happy that we decided to work with an external evaluator, even though our grant is a small grant for institutions new to ATE. Because of the questions our evaluator asked, we have the data to justify moving resources around in our budget.

If you want to know more about how Terryll and I collaborated on the evaluation plan and project proposal, check out this webinar in which we discuss how to find the right external evaluator for your project: Your ATE Proposal: Got Evaluation?.

You may contact the author of this blog entry at: asa.bradley@sfcc.spokane.edu

Blog: Creation, Dissemination, and Accessibility of ATE-Funded Resources

Posted on July 15, 2015 in Blog
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Kendra Bouda,
Metadata and Information Specialist – Internet Scout Research Group
University of Wisconsin-Madison

Rachael Bower,
Director/PI – Internet Scout Research Group
University of Wisconsin-Madison

As most ATE community members are aware, the National Science Foundation requires that all grant applicants provide a one- to two-page data management plan describing how the grantee’s proposal will meet NSF guidelines on the dissemination of grant-funded work. In 2014, NSF added a new requirement to the ATE solicitation mandating that newly funded grantees archive their deliverables with ATE Central.

We were curious to find out more about the materials created within the ATE community. So, when EvaluATE approached us about including questions related to data management planning and archiving in their annual survey of ATE grantees, we jumped at the chance. We had an interest in discovering not only what resources have been created, but also how those resources are disseminated to larger audiences. Additionally, we hoped to discover whether grantees are actively making their materials web accessible to users with disabilities—a practice that ensures access by the broadest possible audience.

The survey responses show that the most widely created materials are (not surprisingly) curriculum and professional development materials, with newsletters and journal articles bringing up the rear. Other materials created by the ATE community include videos, white papers and reports, data sets, and webinars.

However, although grantees are creating a lot of valuable resources, they may not be sharing them widely and, in some cases, may be unsure of how best to make them available after funding ends. The graphs below illustrate the availability of these materials, both currently and after grant funding ends.

[Chart: availability of ATE-funded materials currently and after grant funding ends]

Data from the annual survey shows that 65 percent of respondents are aware of accessibility standards—specifically Section 508 of the Rehabilitation Act; however, 35 percent are not. Forty-eight percent of respondents indicated that some or most of their materials are accessible, while another 22 percent reported that all materials generated by their project or center adhere to accessibility standards. Happily, only 1 percent of respondents reported that their materials do not adhere to standards; however, 29 percent are unsure whether their materials adhere to those standards or not.

For more information about accessibility, visit the official Section 508 site, the World Wide Web Consortium’s (W3C) Accessibility section or the Web Content Accessibility Guidelines 2.0 area of W3C.

Many of us struggle with issues related to sustaining our resources, which is part of the reason we are all asked by NSF to create a data management plan. To help PIs plan for long-term access, ATE Central offers an assortment of free services. Specifically, ATE Central supports data management planning efforts, provides sustainability training, and archives materials created by ATE projects and centers, ensuring access to these materials beyond the life of the project or center that created them.

For more about ATE Central, check out our suite of tools, services, and publications or visit our website. If you have questions or comments, contact us at info@atecentral.net.

Blog: Examining the Recruitment and Retention of Underrepresented Minority Students in the ATE Program

Posted on June 10, 2015 in Blog

Doctoral Associate, EvaluATE, Western Michigan University

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

One of the objectives NSF has for its funded programs is to broaden participation in STEM. Broadening participation entails increasing the number of underrepresented minorities (URMs) in STEM programs of study, employment, and research. NSF defines URMs as “women, persons with disabilities, and three racial/ethnic groups—blacks, Hispanics, and American Indians” (NSF, 2013, p. 2). Lori Wingate and I recently wrote a paper examining the strategies used by ATE grantees for recruiting and retaining URM students and how effective they perceive these strategies to be. Each year, the annual ATE survey collects demographic data on student enrollment in ATE programming. We noticed that when compared with national data, ATE programs were doing well when it came to enrolling URM students, especially African-American students. So we decided to investigate what strategies ATE programs were using to recruit and to retain these students. This study was based on data from the 2013 survey of ATE principal investigators.

Our survey asked about 10 different strategies. The strategies were organized into a framework consisting of three parts: motivation and access, social and academic support, and affordability[1], as presented in the figure below. The percentages and associated bars represent the proportion of grantees who reported using a particular strategy. The gray lines and orange dots represent the rank of perceived impact, where 1 is the highest reported impact and 10 is the lowest.

[Figure: percentage of grantees using each of the 10 strategies and the rank of each strategy's perceived impact]

Overall, we found that ATE projects and centers were using strategies related to motivation and access more than those related to either social/academic support or affordability. These types of strategies are also more focused on recruiting students as opposed to retaining them. It was interesting that there was a greater emphasis on recruitment strategies, particularly because many of these strategies ranked low in terms of perceived impact. In fact, when we compared the overall perceptions of effectiveness to the actual use of particular strategies, we found that many of the strategies perceived to have the highest impact on improving the participation of URM students in ATE programs were being used the least.

Although they are based on the observations of a wide range of practitioners engaged in the daily work of technological education, perceptions of impact are just that: perceptions. The findings must therefore be interpreted with caution. These data raise the question of whether ATE grantees are using the most effective strategies available to them for increasing the participation of URM students.

With the improving economy, enrollment at community colleges is down, putting programs with low enrollment at risk of being discontinued. This makes it ever more important not only to continue to enhance the recruitment of students to ATE programs, but also to use effective and cost-efficient strategies to retain them from year to year.

[1] Hrabowski, F. A., et al. (2011). Expanding underrepresented minority participation: America’s science and technology talent at the crossroads. Washington, DC: National Academies Press. Available here (to access this report by the National Academies of Sciences, you must create a free account)

Blog: Adapting Based on Feedback

Posted on May 13, 2015 in Blog

Director, South Carolina Advanced Technological Education Center of Excellence, Florence-Darlington Technical College

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

The Formative Assessment Systems for ATE project (FAS4ATE) focuses on assessment practices that serve the ongoing evaluation needs of projects and centers. Determining these information needs and organizing data collection activities is a complex and demanding task, and we’ve used logic models as a way to map them out. Over the next five weeks, we offer a series of blog posts that provide examples and suggestions of how you can make formative assessment part of your ATE efforts. – Arlen Gullickson, PI, FAS4ATE

Week 4 – Why making changes based on evidence is important

At the Mentor-Connect: Leadership Development and Outreach for ATE project (www.Mentor-Connect.org), formative feedback guides the activities we provide and the resources we develop. It is the compass that keeps us heading in the direction of greatest impact. I’ll share three examples of how feedback at different stages of the project’s life cycle helped us adapt the project. The first came from an outside source; the other two came from our internal feedback processes.


The initial Mentor-Connect technical assistance workshop for each cohort focuses on developing grant writing skills for the NSF ATE program. The workshop was originally designed to serve teams of two STEM faculty members from participant colleges; however, we were approached by grant writers from those colleges who also wanted to attend. On a self-pay basis, we welcomed these additional participants. Post-workshop surveys and conversations with grant writers at the event indicated that during the workshop we should offer a special breakout session just for grant writers so that issues specific to their role in the grant development and submission process could be addressed. This breakout session was added and is now integral to our annual workshop.


Second, feedback from our mentors about our activities caused us to change the frequency of our face-to-face workshops. Mentors reported that the nine-month time lag between the project’s January face-to-face workshop with mentors and the college team’s submission of a proposal the following October made it hard to maintain momentum. Mentors yearned for more face-to-face time with their mentees and vice versa. As a result, a second face-to-face workshop was added the following July. Evaluation feedback from this second gathering of mentors and mentees was resoundingly positive. This second workshop is now incorporated as a permanent part of Mentor-Connect’s annual programming.


Finally, one of our project outputs helps us keep our project on track. We use a brief reporting form that indicates a team’s progress along a grant development timeline. Mentors and their mentees independently complete and submit the same form. When both responses indicate “ahead of schedule” or “on time” or even “behind schedule,” this consensus is an indicator of good communications between the mentor and his or her college team. They are on the same page. If we observe a disconnect between the mentee’s and mentor’s progress reports, this provides an early alert to the Mentor-Connect team that an intervention may be needed with that mentee/mentor team. Most interventions prompted by this feedback process have been effective in getting the overall proposal back on track for success.
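
As a rough sketch of that cross-check (the status values and function name below are hypothetical, not Mentor-Connect's actual reporting form), the logic amounts to comparing the two independently submitted reports and flagging any mismatch:

  # Hypothetical sketch of the mentor/mentee progress cross-check;
  # the status strings and function name are illustrative only.
  def needs_intervention(mentor_status: str, mentee_status: str) -> bool:
      """Flag a team when the two independent progress reports disagree."""
      return mentor_status != mentee_status

  # Example: the mentor reports "on time" but the college team reports
  # "behind schedule," so an early alert prompts a check-in with that team.
  if needs_intervention("on time", "behind schedule"):
      print("Early alert: follow up with this mentor/mentee team.")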

With NSF ATE projects, PIs have the latitude, and are expected, to make adjustments that improve project outcomes. After all, it is a grant, not a contract. NSF expects you to behave like a scientist and adjust based on evidence. So don’t be glued to your original plan! Change can be a good thing. The key is to listen to those who provide feedback, study your evaluation data, and adjust accordingly.

Blog: Finding Opportunity in Unintended Outcomes

Posted on April 15, 2015 in Blog

Research and Evaluation Consultant, Steven Budd Consulting

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Working with underage students carries an increased responsibility for their supervision. Concerns may arise during the implementation of activities that were never envisioned when the project was designed. These unintended consequences may be revealed during an evaluation, presenting an opportunity for PIs and evaluators to both learn and intervene.

One project I’m evaluating includes a website designed for young teens, and features videos from ATETV and other sources. The site encourages our teen viewers to share information about the site with their peers and to explore links to videos hosted on other popular sites like YouTube. The overarching goal is to attract kids to STEM and technician careers by piquing their interest with engaging and accurate science content. What we didn’t anticipate was the volume of links to pseudoscience, science denial, and strong political agendas they would encounter. The question for the PI and Co-PIs became, “How do we engage our young participants in a conversation about good versus not-so-good science and how to think critically about what they see?”

As the internal project evaluator, I first began a conversation with the project PI and senior personnel around the question of responsibility. What is the responsibility of the PIs to engage our underage participants in a conversation about critical thinking and learning, so they can discriminate between questionable and solid content? Such content is readily accessible to young teens as they surf the Web, so a more important question was how the project team might capture this reality and capitalize on it. In this sense, was a teaching moment at hand?

As evaluators on NSF-funded projects, we know that evaluator engagement is critical right from the start. Formative review becomes especially important when even well-designed, well-thought-out activities take unanticipated turns. Our project incorporates a model of internal evaluation, which enables project personnel to gather data and provide real-time assessment of activity outcomes. We then present the data, with commentary, to our external evaluator. The evaluation team works with the project leadership to identify concerns as they arise and to strategize a response. That response might include refining activities and how they are implemented or creating entirely new activities that address a concern directly.

After thinking it through, the project leadership chose to open a discussion about critical thinking and science content with the project’s teen advisory group. Our response was to:

  • Initiate more frequent “check-ins” with our teen advisers and have more structured conversations around science content and what they think.
  • Sample other teen viewers as they join their peers in the project’s discussion groups and social media postings.
  • Seek to better understand how teens engage with Internet-based content and how they make sense of what they see.
  • Seek new approaches to activities that engage young teens in building their science literacy and critical thinking.

Tips to consider

  • Adjust your evaluation questions to better understand the actual experience of your project’s participants, and then look for the teaching opportunities in response to what you hear.
  • Vigilant evaluation may reveal the first signs of unintended impacts.