Mike Lesiecki

Executive Director, Maricopa Advanced Technology Education Center

Michael Lesiecki is the Executive Director of the Maricopa Advanced Technology Education Center at the Maricopa Community Colleges in Tempe, Arizona. Michael is the principal investigator for a large grant from the National Science Foundation’s Advanced Technological Education program. He has 27 peer-reviewed journal publications and one patent. He received his PhD in Physical Chemistry from Oregon State University. He was a Research Professor at the University of Utah and an Associate Professor at the University of Puerto Rico. At Exxon Research and Engineering, Dr. Lesiecki worked as a Senior Scientist, and at Candela Laser Corporation he was the Director of the Bioscience Division.

Dr. Lesiecki currently serves on proposal review committees for the NSF, the Department of Education, and the Department of Labor. Lesiecki was a member of the NSF’s Committee of Visitors in 2012; the committee provides NSF with external expert review of program quality and outcomes. Lesiecki has managed major grants from the National Science Foundation and the National Institutes of Health over the past 23 years. He now serves as the Chair of the Community College Liaison Panel for EvaluATE, the Evaluation Resource Center at Western Michigan University.

Webinar: Evaluation: All the Funded ATE Proposals Are Doing It

Posted on August 10, 2017 in Webinars

Presenter(s): Lori Wingate, Mike Lesiecki
Date(s): August 16, 2017
Time: 1:00-2:00 p.m. Eastern
Recording: https://youtu.be/7ytTEGt_FoM

Give your proposal a competitive edge with a strong evaluation plan. The National Science Foundation has issued a new solicitation for its Advanced Technological Education (ATE) program that includes major changes to the guidelines for ATE evaluation plans. Attend this webinar to learn the key elements of a winning evaluation plan and strategies for demonstrating to reviewers that evaluation is an integral part of your project, not an afterthought. In addition, we’ll provide specific guidance on how to budget for an evaluation, locate a qualified evaluator, and describe results from prior NSF support with supporting evaluative evidence. You will receive updated resources and other tools to help you prepare strong evaluation plans:

ATE Proposal Evaluation Plan Template
Data Collection Planning Matrix
Evaluator Biographical Sketch Template for National Science Foundation (NSF) Proposals
Evaluation Planning Checklist for ATE Proposals
Evaluation Questions Checklist for Program Evaluation
Guide to Finding and Selecting an Evaluator
Logic Models: Getting them Right and Using them Well [webinar]
Logic Model Template for ATE Projects and Centers
NSF Prior Support Checklist
Small-Scale Evaluation Webinar

Blog: Breaking Up is Hard to Do

Posted on January 7, 2015 in Blog


This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

I am a PI for an ATE national resource center and an ATE project. Over the past 15 years, I have dealt with seven different evaluators. The management of that evaluator/evaluation relationship is important. Attention to certain details can pay off in the long run.

There is a lot of excitement when a grant proposal is funded. All of that work seems justified. Personnel are assigned and the contractual relationship with the evaluator begins. While everything is rosy, now is the time to anticipate the possibility that the evaluator relationship may go south for a variety of reasons.

It is critical to have a good statement of work that will support the contract you will create. You can stand behind that statement and its deliverables and dates. That contract has many provisions, one of which is the exit clause.

An exit or termination clause is standard practice in business. Let’s face it: life happens, and things may not work as intended. The evaluation effort may slip on the evaluator’s side due to competing priorities, overload, poor scheduling, or factors beyond anyone’s control. We are human.

Case #1: We had an evaluator from a local university. It started off fine, but before too long it seemed she thought she was doing us a favor by allowing us to work with her. The evaluation came through as half-hearted and stilted, sounding like boilerplate. Still, we had to have an evaluator, and we made it through the first year. In year two, she let us know that she was so busy she could devote only half the time, at twice her rate. We invoked the exit clause, which read, “This contract may be terminated for convenience giving contractor 15 days written notice of termination.” She informed us that we could not fire her and that she would notify the NSF. We beat her to it by notifying the NSF first; they supported our decision. They asked only to see the new evaluator’s qualifications when available.

Case #2: The evaluator relationship was good, but a report slipped, then the second report really slipped, and we missed appending the evaluation report to our annual report. We had a frank conversation; the problem was sheer overload. We mutually agreed to use the exit clause. I sent the letter by registered mail to record the delivery signature. We remain friends with the evaluator today.

Case #3: A colleague’s project evaluator was struggling to make deliverables. The exit clause was invoked, but the evaluator was very reluctant to release the accumulated project data. This was placing the whole evaluation effort at risk. The new evaluator helped intervene and ultimately the data was transferred.

Lessons: Stay on top of the deliverables and timeline. If the evaluator misses a deadline without giving you any warning, that is a “tell.” Act quickly, don’t overreact, invoke the exit clause if you need to, and don’t just “hope” things will get better. Move on and secure another evaluator.


Principal Investigator “To-Do” Checklist: Before Launching Your Project Evaluation

Negotiating Agreements Checklist

Webinar: Orientation to ATE Survey 2014

Posted on January 22, 2014 in Webinars

Presenter(s): Corey Smith, Krystin Martens, Lori Wingate, Mike Lesiecki
Date(s): January 22, 2014
Recording: https://vimeo.com/84959802

In this webinar, EvaluATE staff will help ATE grantees prepare for the upcoming annual ATE survey (which takes place February 18 – March 18). We will provide a brief overview of the survey and its administration process, address frequently asked questions (both substantive and technical), and clarify definitions. A crosswalk of ATE data that compares information grantees need to include in the annual survey, their NSF annual reports submitted through research.gov, and project-level evaluations will help webinar participants anticipate information needs and streamline data collection and reporting.

Slide PDF
ATE Survey 2014 FAQs
National Science Foundation Annual Report Components
2014 ATE Annual Survey

Newsletter: Meet EvaluATE’s Community College Liaison Panel

Posted on January 1, 2014 in Newsletter

The ATE program is community college-based, and as such EvaluATE places a priority on meeting the needs of this constituency. To help ensure the relevancy and utility of its resources, EvaluATE has convened a Community College Liaison Panel (CCLP). CCLP members Michael Lesiecki, Marilyn Barger, Jane Ostrander, and Gordon Snyder are tasked with keeping the EvaluATE team tuned into the needs and concerns of 2-year college stakeholders and engaging the ATE community in the review and pilot testing of EvaluATE-produced materials.

These resources distill relevant elements of evaluation theory, principles, and best practices so that a user can quickly understand and apply them for a specific evaluation-related task. They are intended to support members of the ATE community to enhance the quality of their evaluations.

The CCLP’s role is to coordinate a three-phase review process. First, CCLP members conduct a first-level review of an EvaluATE resource. After the EvaluATE team revises it based on the CCLP’s feedback, each of the four CCLP members reaches out to diverse members of the ATE community—PIs, grant developers, evaluators, and others—to review the material and provide confidential, structured feedback and suggestions. After another round of revisions, the CCLP engages another set of ATE stakeholders to try out the resource and ensure it “works” as intended in the real world. Following this pilot testing, EvaluATE finalizes the resource for wide dissemination.

The CCLP has shepherded two resources through the entire review process: the ATE Evaluation Primer and ATE Evaluation Planning Checklist. In the hopper for review in the next few months are the ATE Logic Model Template and Evaluation Planning Matrix, Evaluation Questions Checklist, ATE Evaluation Reporting Checklist, and Professional Development Feedback Survey Template. In addition, CCLP members are leading the development of a Guide to ATE Evaluation Management—by PIs for PIs.

The CCLP invites anyone interested in ATE evaluation to participate in the review process. For a few hours of your time, you’ll get a first look at, and a chance to try out, new resources. Your input will help shape and strengthen the ATE evaluation community. We also welcome recommendations of tools and materials developed by others that would be of interest to the ATE community.

To get involved, email CCLP Director Mike Lesiecki at mlesiecki@gmail.com. Tell him you would like to help make EvaluATE the go-to evaluation resource for people like yourself.

Webinar: The Nuts and Bolts of ATE Evaluation Reporting

Posted on May 15, 2013 in Webinars

Presenter(s): Jason Burkhardt, Krystin Martens, Lori Wingate, MATE – Marine Advanced Technology Education Center, Mike Lesiecki
Date(s): May 15, 2013
Time: 1:00 p.m. EDT
Recording: https://vimeo.com/66343717

In this webinar, we will give practical advice about evaluation reporting in the ATE context, including report content and structure, integrating evaluation report content into annual reports to NSF, and using results. We will provide step-by-step guidance for developing an ATE evaluation report that balances the competing demands that reports be both comprehensive and concise. We’ll discuss the how, where, and what of including evaluation results in NSF annual reports and project outcome reports. Finally, we’ll address how to use evaluation results to inform project-level improvements and build the case for further funding. Participants will leave the webinar with a clear strategy for creating effective ATE evaluation reports that meet NSF accountability requirements and support project-level improvement. *This webinar was inspired, in part, by the work of Jane Davidson, author of Evaluation Methodology Basics: The Nuts and Bolts of Sound Evaluation (Sage, 2005).

Slide PDF
Handout PDF
Example highlights report: MATE Program Highlights

Webinar: Build a Better ATE Proposal with Evaluation and Logic Models

Posted on August 15, 2012 in Webinars

Presenter(s): Connie Della-Piana, Jason Burkhardt, Mike Lesiecki, Penny Billman
Date(s): August 15, 2012
Recording: https://vimeo.com/47686001

A grant proposal that includes a strong evaluation plan linked to a sound project logic model will be reviewed more favorably than one that does not. Will it make or break your chances for funding? No, but attending to evaluation matters as you develop your proposal, especially in terms of how you will assess your project’s intellectual merit and broader impacts, is likely to strengthen your overall proposal and give you a competitive edge. Considering your proposal through an evaluative lens as you craft it can help you avoid common proposal pitfalls, such as writing goals that are either too lofty or too simplistic, or failing to demonstrate a logical relationship between your activities and your intended outcomes. In this webinar, we’ll share two tools specifically developed for ATE proposers: a checklist for developing evaluation plans for ATE proposals and a template for creating simple yet powerful project logic models. There will be ample time for questions with our expert panel, which includes an ATE evaluator, PI, and program officer.

By the end of the webinar, participants will:
1. Understand how to prepare an ATE proposal that meets NSF’s requirements for evaluation
2. Know how to establish a working relationship with an external evaluator and what to expect from him or her (both before and after award)
3. Be able to create a logic model to convey your proposed project’s activities and intended outcomes
4. Understand the role of evaluation in ATE projects and how to align an evaluation plan with project goals

Slide PDF
Evaluation Planning Checklist for ATE Proposals

Webinar: Strong Evaluation Plans = Stronger Proposals

Posted on July 20, 2011 in Webinars

Presenter(s): Elizabeth Teles, Lori Wingate, Mike Lesiecki, Norena Badway, Stephanie Evergreen
Date(s): July 20, 2011
Recording: https://vimeo.com/26728898

It’s that time of year again, when we are just a couple of short months away from the due date for the next round of ATE proposals. Join us as we review the elements of an ATE proposal’s evaluation component and how to use it to strengthen your submission. We’ll discuss how to tie evaluation tasks to the grant’s goals and objectives and how to be sure the evaluation is responsive to NSF’s expectations for ATE projects and centers. Wondering how to incorporate evaluation into your budget? Need advice on how you can convey that you’ll use evaluation for project improvement? This webinar will help you integrate evaluation into your project work and clearly discuss the project-evaluation relationship in your proposal.

In this webinar, Liz Teles (of Teles Consulting and former co-lead for NSF’s ATE program) will share helpful hints and fatal flaws related to evaluation plans in ATE proposals. Check out the one-page and expanded versions of her 10 Helpful Hints and 10 Fatal Flaws.

Slide PDF
Handout PDF
10 Helpful Hints and 10 Fatal Flaws: Writing Better Evaluation Sections in Your Proposals (long)
10 Helpful Hints and 10 Fatal Flaws: Writing Better Evaluation Sections in Your Proposals (short)