Newsletter - Project Spotlight

Newsletter: Project Spotlight: Geospatial Technician Education – Unmanned Aircraft Systems & Expanding Geospatial Technician Education through Virginia’s Community Colleges

Posted on July 1, 2016

Deputy Director, Virginia Space Grant Consortium

Chris Carter is the Deputy Director of the Virginia Space Grant Consortium, where he leads two ATE projects.

How do you use logic models in your ATE projects?

Our team recently received our fourth ATE award, which will support the development of academic pathways and faculty training in unmanned aircraft systems (UAS). UAS, when combined with geospatial technologies, will revolutionize spatial data collection and analysis.

Visualizing desired impacts and outcomes is an important first step to effective project management. Logic models are wonderful tools for creating a roadmap of key project components. As a principal investigator on two ATE projects, I have used logic models to conceptualize project outcomes and the change that our team desires to create. Logic models are also effective tools for articulating the inputs and resources that are leveraged to offer the activities that bring about this change.

With facilitation and guidance from our partner and external evaluator, our team developed several project logic models. We developed one overarching project logic model to conceptualize the intended outcomes and desired change of the regional project. Each community college partner also developed a logic model to capture its unique goals and theory of change while also articulating how it contributes to the larger effort. These complementary logic models allowed the team members to visualize and understand their contributions while ensuring everyone was on the same path.

Faculty partners used these logic models to inform their administrations, business partners, and employers about their work. They are great tools for sharing the vision of change and building consensus among key stakeholders.

Our ATE projects are focused on creating career pathways and building faculty competencies to prepare technicians. The geospatial and UAS workforce is a dynamic, constantly evolving employment sector. We find logic models helpful for keeping the team and partners focused on the desired outputs and outcomes. The models remind us of our goals and help us understand how the components fit together. It is crucial to identify the project inputs and to understand that as these evolve, project activities also need to evolve. Continually updating a logic model and understanding the relationships among its sections are key pieces of project management.

I encourage all ATE project leaders to work closely with their project evaluators and to integrate logic models into their work. Our external evaluator was instrumental in persuading our team to adopt these models. Project evaluators must be viewed as team members and partners from the beginning. I cannot imagine effectively managing a project without the aid of this project blueprint.

Newsletter: Project Spotlight: Geospatial Technology Advantage Project

Posted on April 1, 2016

The Geospatial Technology Advantage: Preparing GST Technicians and GST-enabled Graduates for Southern Illinois Business and Industry, Lake Land College/Kaskaskia College

Mike Rudibaugh is principal investigator for the Geospatial Technology Advantage Project at Kaskaskia College. The project’s external evaluator is Elaine Craft of SC ATE.

How do your evaluation results come into play with regard to making decisions about your project?

External evaluation feedback has been critical in helping the grant team assess our timelines for meeting the grant's goals. This feedback has reshaped our grant team's view of the goals, budget, and measurable outcomes we could realistically achieve within the life cycle of the grant. The external evaluator's logic model has been critical in helping the grant team track goals, objectives, and measurable outcomes for a successful grant.

Can you give an example of something you learned from your evaluation that led you to make a change in your project?

Observations from our external evaluator's site visit and report led to major shifts in the grant's overall focus. Her recommendations indicated that a revised scope of work document needed to be submitted to the grant's program officer. Simply put, the external evaluation process revealed that the grant's goals and objectives were overly ambitious, considering that the institution was new to the ATE program, the original PI retired after year one, and the rural region was slow to adopt a dynamic new STEM field like geospatial technology. These recommendations led to a revised scope of work document outlining more achievable grant goals. They also supported a long-term approach to building a viable geospatial technology program by integrating it with existing programs on campus.

What types of reports does your evaluator provide and what type of information do you find most useful?

Our evaluator produces annual written reports and conducts one site visit each year. Each of these assists the grant team with connecting the different elements of the grant. The evaluator often sees the big picture and at times helps the college open doors and connect resources to move the project forward. The ATE grant and evaluation process provided a platform to discuss program integration with faculty who direct other programs that could benefit from incorporating geospatial technology. Linking these success stories into the grant's annual report has helped us grow more on-campus champions for STEM integration using geospatial technology across the curriculum. The annual site visit has been critical to developing these partnerships. The external evaluator helps cut through the long-standing curriculum partnership barriers between transfer and occupational programs on community college campuses. Focusing on positive student outcomes, such as program completion and job attainment, has helped faculty concentrate on issues that benefit students.

Newsletter: Project Spotlight: PATHTECH Successful Academic & Employment Pathways in Advanced Technologies

Posted on January 1, 2016

Associate Professor of Sociology, University of South Florida

Will Tyson is PI for PATHTECH, an ATE targeted research project. He is an associate professor of sociology at the University of South Florida. Learn more about his project at www.sociology.usf.edu/pathtech/.

Q: What advice do you have for PIs who want to pursue targeted research in technician education?

The Targeted Research on Technician Education strand of ATE is an ideal avenue for current ATE PIs looking to fund small projects to learn more about student outcomes resulting from prior activities. The best advice I have is to seek out scholars with backgrounds in social science and education, preferably with NSF experience, to partner with on a targeted research submission.

Q: You’ve published numerous articles on your research. What is your sense of what journal editors and reviewers are looking for when it comes to research on technician education?

I’m not sure journal editors and reviewers are actually looking for research on technician education. This is both a challenge and an opportunity. Most STEM education research generally ignores the “T” and focuses on traditional pathways to science, engineering, and mathematics degrees and careers. I think people know “good tech jobs” exist, but generally lack knowledge about the educational pathways to those jobs and the rich life stories of community college students in technician education programs.

Q: How do you see ATE research fitting within the NSF-IES Common Guidelines for Education Research and Development?

I think there are some challenges to fitting ATE research into the Common Guidelines. There are several research types and ATE researchers have to be careful to make sure the type they choose is the best fit for their research questions. The Guidelines are a good start for new investigators, but senior investigators should continue to build upon their work and use prior research to justify their new research ideas.

Q: Based on your experience as an NSF proposer and reviewer, what are some common mistakes when it comes to targeted research proposals?

Everyone should pay close attention to the goals of the Targeted Research on Technician Education track as outlined in the ATE program solicitation, which are to stimulate and support research on technician education and to build the partnership capacity of 2- and 4-year institutions to design and conduct research and development projects. All projects should focus on studying technician education through partnerships between 2- and 4-year institutions. In my experience, targeted research proposals tend to be led by 2-year college faculty or by scholars from 4-year institutions or private research institutes. The 2-year personnel tend to lack the capacity to conduct targeted research due to a lack of experience or personnel, as evidenced by their biosketches. On the other hand, 4-year personnel tend to lack familiarity with 2-year colleges and seek to use students as “guinea pigs.” Proposals often do not show that the scholar will be able to recruit student participants. Targeted research proposals should show clear evidence that the 2- and 4-year institutions or private research institutes are going to work collaboratively.

Newsletter: Project Spotlight: Manufacturing Associate Degree Education in Northwestern Connecticut

Posted on October 1, 2015

Professor, Biology, Northwestern Connecticut Community College

A conversation with Sharon Gusky, an ATE PI at Northwestern Connecticut Community College.

Q: Your ATE project started just over a year ago. What do you know now that you wish you’d known then about project evaluation?

A: I wish I had better understood what information would be useful to collect before the start of the grant, so we would have been better prepared to capture baseline and impact data. This is our first NSF grant, and it allowed us to start a new manufacturing program. The community was excited about and very supportive of it. In the first year we received many requests to speak at events, do radio and cable TV shows, and visit high schools, but we did not have a way to capture the impact of these activities.

Q: What advice do you have for new PIs with regard to working with an evaluator?

A: Start working with your evaluator early, and set clear timelines for checking in and for reviewing and analyzing the data as it is collected. The information you collect along the way can help shape the program. We learned early on through student interviews that students did not like the course schedule, which required them to wait a semester or a summer to take the second technical course in a sequence. We used their feedback to revise the schedule so that each course ran for eight weeks during a semester. If we had waited until the end of the spring semester to find this out, it would have been too late to implement the change for fall.

Q: What challenges did you face in getting the evaluation off the ground?

A: We faced a number of scheduling challenges and some miscommunication with regard to data collection. We hadn't clearly defined the roles of the various people involved: external evaluator, institutional research director, PI, and co-PIs. We needed to sit down together and work out a plan so that the data we needed was being collected and shared.

Newsletter: Project Spotlight: E-MATE

Posted on July 1, 2015

Professor and chair of engineering and technology at Brookdale Community College, E-MATE

A conversation with Mike Qaissaunee, E-MATE’s principal investigator

Q: How did you work with your evaluator during proposal development?

A: As PI, and as an experienced evaluator myself, I wrote the initial evaluation plan and selected a longtime colleague to act as external evaluator. The proposal was funded with the understanding that we would select a new evaluator, as panelists felt the initial evaluator was too close to me (the PI) and would have difficulty being objective. We selected a new evaluator with significant experience with NSF, ATE, and community colleges. Through a number of calls and meetings, we discussed the proposal, detailed our goals and objectives, answered a number of really good questions, and identified the key things we hoped to learn. Our new evaluator was able to build on my original evaluation plan, developing a rich evaluation framework and logic model.

Q: What advice do you have for communicating an evaluation plan in a proposal?

A: As proposals are fairly short, it’s important to keep the evaluation plan brief and specific to the project, rather than boilerplate. If possible, communicate information in a table and/or graphic. Evaluation metrics and tasks can also be included in tables detailing timelines, activities, and goals and objectives.

Q: How did you integrate evaluation results from a prior project into your proposal?

A: I’ve found that the most powerful approach to including evaluation results in a proposal is a judicious mix of qualitative and quantitative data. Quantitative data demonstrates past success and capacity for future work, while qualitative results bring the proposal to life and engage readers. Evaluation results can also be used to highlight areas with limited success and new areas for investigation. I don’t shy away from addressing such evaluation data, as doing so demonstrates that the project team is learning and adapting.

Newsletter: Project Spotlight: MatEdU

Posted on April 1, 2015

Principal Investigator, MatEdU, Edmonds Community College

Mel Cossette is principal investigator for the National Resource Center for Materials Technology Education at Edmonds Community College. MatEdU’s mission is to advance materials technology education nationally. 

Q: What advice would you give to a PI in their first reporting period?

A: First, check the Reporting section of Research.gov to confirm the annual report due date. Sometimes a first-time PI refers to their award date or start date, but it's actually the due date listed on this website that is critical. Second, connect with your evaluator and inform him or her of the report due date. This helps with the planning and writing processes and assists with identifying, early in the process, the information to be shared. This does not mean things cannot change, but it is essential that the evaluator and PI communicate.

Q: How do you use your evaluation results in your annual report to NSF?

A: Typically, we create a rough draft of the annual report from our perspective, which we share with our evaluator. The evaluator reviews it and provides feedback. In the meantime, we continue building our report, paying attention to the different categories within it, such as accomplishments, significant activities, and products developed. During this time, our evaluator develops a draft report that is shared with us. Although the reports have a different focus and are written in different formats, we compare content from the two. That helps us be succinct with the data and information the reports require. We find that this collaborative process helps keep our team focused on the task at hand.

Q: What are some things that make an evaluation report useful (from a PI’s perspective)?

A: Because the information comes from a semi-external perspective, we get the chance to compare the evaluation report on our activities, successes, areas that may need review, and so on, to our activity timeline. This helps limit scope creep. The recommendations from our evaluator also enabled us to identify a potential gap in our activities that needed to be addressed. PIs are usually completely focused on their projects and annual reports, so having an external evaluator point out successes, gaps, inconsistencies, and data points reinforces progress and project direction.