Michael Lesiecki

Luka Partners LLC

Dr. Michael Lesiecki has over 20 years of experience championing collaboration-driven development, educational program growth, assessment, and advocacy. His federal grants development, management, and evaluation experience includes proposals and projects up to $20M. With a Ph.D. in Physical Chemistry and collaborative industry experience, Dr. Lesiecki is uniquely knowledgeable about STEM education and high-tech domains. Over the past two decades he has been deeply involved with the National Science Foundation's Advanced Technological Education program, acting as Principal Investigator, External Evaluator, and Peer Reviewer. He now serves as the Principal of Luka Consulting LLC, a firm focused on evaluation services.


Blog: The Business of Evaluation: Liability Insurance

Posted on January 11, 2019 in Blog

Luka Partners LLC

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Bottom line: you may need liability insurance, and you have to pay for it.

The proposal has been funded, you are the named evaluator, you have created a detailed scope of work, and the educational institution has sent you a Professional Services Contract to sign (and read!).

This contract will contain many provisions, one of which is having insurance. I remember the first time I read it: The contractor shall maintain commercial general liability insurance against any claims that might be incurred in carrying out this agreement. Minimum coverage shall be $1,000,000.

I thought, well, this probably doesn’t pertain to me, but then I read further: Upon request, the contractor is required to provide a Certificate of Insurance. That got my attention.

You might find what happened next interesting. I called the legal office at the community college. My first question was, "Can we just strike that from the contract?" The answer was no; they were required by law to include it. Then the attorney explained, "Mike, that sort of liability thing is mostly for contractors coming to do physical work on our campus, in case there is an injury, a brick falling on the head of a student, things like that." She lowered her voice. "I can tell you we are never going to ask you to show that certificate to us."

However, sometimes, you will be asked to maintain and provide, on request, professional liability insurance, also called errors and omissions insurance (E&O insurance) or indemnity insurance. This protects your business if you are sued for negligently performing your services, even if you haven’t made a mistake. (OK, I admit, this doesn’t seem likely in our business of evaluation.)

Then the moment of truth came. A decent-sized contract arrived from a major university I shall not name located in Tempe, Arizona, with a mascot that is a devil with a pitchfork. It said if you want a purchase order from us, sign the contract and attach your Certificate of Insurance.

I was between the devil and a hard place. Somewhat naively, I called my local insurance agent (i.e., the one for my home and car). He had actually never heard of professional liability insurance and promised to get back to me. He didn't.

I turned to Google, the fount of all things. (Full disclosure, I am not advocating for a particular company—just telling you what I did.) I explored one company that came up high in the search results. Within about an hour, I was satisfied that it was what I needed, had a quote, and typed in my credit card number. In the next hour, I had my policy online and printed out the one-page Certificate of Insurance with the university’s name as “additional insured.” Done.

I would like to clarify one point. I did not choose general liability insurance, because my operations pose no risk of physical damage to property or injury to people. In the business of evaluation, that is simply not a risk.

I now have a $2 million professional liability insurance policy that costs $700 per year. As I add clients, if they require it, I can create a one-page certificate naming them as additional insured, at no extra cost.

Liability insurance, that’s one of the costs of doing business.

Three Common Evaluation Fails and How to Prevent Them

Posted on December 4, 2018 in Webinars

Presenter(s): Kirk Knestis, Michael Lesiecki
Date(s): January 30, 2019
Time: 1:00-2:00 p.m. Eastern

In this webinar, experienced STEM education evaluator Kirk Knestis will share strategies for effectively communicating with evaluation clients to avoid three common "evaluation fails": (1) project implementation delays; (2) evaluation scope creep (clients wanting something more than or different from what was originally planned); and (3) substantial changes in the project over the course of the evaluation. These issues are typical causes for an evaluation to be derailed and fail to produce useful and valid results. Webinar participants will learn how clear documentation—specifically, an evaluation contract (legal commitment to the work), scope of work (detailed description of evaluation services and deliverables), and study protocol (technical details concerning data collection and analysis)—can make potentially difficult conversations go better for all involved, averting potential evaluation crises and failures. Getting these documents right and using them in project communications helps ensure a smoothly operating evaluation, a happy client, and a profitable project for the evaluator.

For a sneak peek of some of what Kirk will address in this webinar, see his blogpost, http://www.evalu-ate.org/blog/knestis-apr18/.

Presenter: Kirk Knestis
Moderator: Mike Lesiecki



Webinar: Give Your Proposal A Competitive Edge with a Great Evaluation Plan

Posted on July 17, 2018 in Webinars

Presenter(s): Lori Wingate, Michael Lesiecki
Date(s): August 22, 2018
Time: 1:00-2:00 p.m. Eastern
Recording: https://youtu.be/Y5FJooZ913w

A strong evaluation plan will give your proposal a competitive edge. In this webinar, we’ll explain the essential elements of an effective evaluation plan and show you how to incorporate them into a proposal for the National Science Foundation’s Advanced Technological Education program. We’ll also provide guidance on how to budget for an evaluation, locate a qualified evaluator, and use evaluative evidence to describe the results from prior NSF support (required if you’ve had previous NSF funding). Participants will receive an updated Evaluation Planning Checklist for ATE Proposals and other resources to help prepare strong evaluation plans.

Resources:
Slides
Webinar Questions Answered Post Event
ATE Evaluation Plan Checklist
ATE Evaluation Plan Template
Guide to Finding and Selecting an ATE Evaluator
ATE Evaluator Map
Evaluation Data Matrix
NSF Evaluator Biosketch Template
NSF ATE Program Solicitation

Webinar: Evaluation Basics for Non-evaluators

Posted on February 1, 2018 in Webinars

Presenter(s): Elaine Craft, Lori Wingate, Michael Lesiecki
Date(s): March 14, 2018
Time: 1:00-2:00 p.m. Eastern
Recording: https://youtu.be/Zb4oQZe7HtU

Abstract:

e · val · u · a · tion: determination of the value, nature, character, or quality of something or someone*

But what is program evaluation?

Why does the National Science Foundation (NSF) require that the projects they fund be evaluated? How much does it cost? Who can do it? What does a good evaluation plan look like? What will happen? What are you supposed to do with the results?

In this webinar, we’ll answer these and other common questions about program evaluation. This session is for individuals with limited experience with program evaluation, especially two-year college faculty and grants specialists who are planning on submitting proposals to NSF’s Advanced Technological Education program this fall.

*merriam-webster.com


Resources:
Handout
Slides
Webinar Q&A
1. ATE Program Overview
2. Evaluation Responsibility Matrix
3. Evaluation Timeline
4. Evaluation Process Overview
5. Example Project Logic Model

Webinar: Evaluation: All the Funded ATE Proposals Are Doing It

Posted on August 10, 2017 in Webinars

Presenter(s): Lori Wingate, Mike Lesiecki
Date(s): August 16, 2017
Time: 1:00-2:00 p.m. Eastern
Recording: https://youtu.be/7ytTEGt_FoM

Give your proposal a competitive edge with a strong evaluation plan. The National Science Foundation has issued a new solicitation for its Advanced Technological Education (ATE) program. It includes major changes to the guidelines for ATE evaluation plans. Attend this webinar to learn the key elements of a winning evaluation plan and strategies for demonstrating to reviewers that evaluation is an integral part of your project, not an afterthought. In addition, we'll provide you with specific guidance on how to budget for an evaluation, locate a qualified evaluator, and describe results from prior NSF support with supporting evaluative evidence. You will receive an updated Evaluation Planning Checklist for ATE Proposals and other tools to help prepare strong evaluation plans.

Resources:
Slides
ATE Proposal Evaluation Plan Template
Data Collection Planning Matrix
Evaluator Biographical Sketch Template for National Science Foundation (NSF) Proposals
Evaluation Planning Checklist for ATE Proposals
Evaluation Questions Checklist for Program Evaluation
Guide to Finding and Selecting an Evaluator
Logic Models: Getting them Right and Using them Well [webinar]
Logic Model Template for ATE Projects and Centers
NSF Prior Support Checklist
Small-Scale Evaluation Webinar

Blog: Breaking Up is Hard to Do

Posted on January 7, 2015 in Blog

Luka Partners LLC

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

I am a PI for an ATE national resource center and an ATE project. Over the past 15 years, I have dealt with seven different evaluators. The management of that evaluator/evaluation relationship is important. Attention to certain details can pay off in the long run.

There is a lot of excitement when a grant proposal is funded. All of that work seems justified. Personnel are assigned and the contractual relationship with the evaluator begins. Although everything is rosy, now is the time to anticipate the possibility that the evaluator relationship may go south for a variety of reasons.

It is critical to have a good statement of work that will support the contract you will create. You can stand behind that statement and its deliverables and dates. That contract has many provisions, one of which is the exit clause.

An exit or termination clause is standard practice in business. Let’s face it, life happens, things may not work as intended, the evaluation effort may slip on the evaluator’s side due to competing priorities, overload, poor scheduling, or factors beyond anyone’s control. We are human.

Case #1: We had an evaluator from a local university. It started off fine, but before too long it seemed she thought she was doing us a favor by allowing us to work with her. The evaluation came through as half-hearted and stilted, sounding like boilerplate. Still, we had to have an evaluator, and we made it through the first year. In year two, she let us know that she was so busy she could devote only half the time, at twice her rate. We invoked the exit clause, which read, "This contract may be terminated for convenience by giving the contractor 15 days written notice of termination." She informed us that we could not fire her and that she would notify the NSF. We beat her to it by notifying the NSF first; they supported our decision. They just asked to see the new evaluator's qualifications when available.

Case #2: The evaluator relationship was good, but a report slipped, then a second report really slipped, and we missed appending the evaluation report to our annual report. We had a frank conversation; the problem was just sheer overload. We mutually agreed to use the exit clause. I sent the letter by registered mail to record the delivery signature. We remain friends with the evaluator today.

Case #3: A colleague’s project evaluator was struggling to make deliverables. The exit clause was invoked, but the evaluator was very reluctant to release the accumulated project data. This was placing the whole evaluation effort at risk. The new evaluator helped intervene and ultimately the data was transferred.

Lessons: Stay on top of the deliverables and timeline. If the evaluator misses a deadline and does not give you any warning, that is a "tell." Act quickly, don't overreact, invoke the exit clause if you need to, and don't just "hope" things will get better. Move on and secure another evaluator.

Resources:

Principal Investigator “To-Do” Checklist: Before Launching Your Project Evaluation

Negotiating Agreements Checklist

Webinar: Orientation to ATE Survey 2014

Posted on January 22, 2014 in Webinars

Presenter(s): Corey Smith, Krystin Martens, Lori Wingate, Mike Lesiecki
Date(s): January 22, 2014
Recording: https://vimeo.com/84959802

In this webinar, EvaluATE staff will help ATE grantees prepare for the upcoming annual ATE survey (which takes place February 18 – March 18). We will provide a brief overview of the survey and administration process, address frequently asked questions (both substantive and technical), and clarify definitions. A crosswalk of ATE data that compares information grantees need to include in the annual survey, their NSF annual reports submitted through research.gov, and project-level evaluations will help webinar participants anticipate information needs and streamline data collection and reporting.

Resources:
Slide PDF
ATE Survey 2014 FAQs
National Science Foundation Annual Report Components
2014 ATE Annual Survey

Newsletter: Meet EvaluATE’s Community College Liaison Panel

Posted on January 1, 2014 in Newsletter

The ATE program is community college-based, and as such EvaluATE places a priority on meeting the needs of this constituency. To help ensure the relevancy and utility of its resources, EvaluATE has convened a Community College Liaison Panel (CCLP). CCLP members Michael Lesiecki, Marilyn Barger, Jane Ostrander, and Gordon Snyder are tasked with keeping the EvaluATE team tuned into the needs and concerns of 2-year college stakeholders and engaging the ATE community in the review and pilot testing of EvaluATE-produced materials.

These resources distill relevant elements of evaluation theory, principles, and best practices so that a user can quickly understand and apply them for a specific evaluation-related task. They are intended to support members of the ATE community to enhance the quality of their evaluations.

The CCLP’s role is to coordinate a three-phase review process. CCLP members conduct a first-level review of an EvaluATE resource. The EvaluATE team revises it based on the CCLP’s feedback, then each of the four CCLP members reaches out to diverse members of the ATE community—PIs, grant developers, evaluators, and others—to review the material and provide confidential, structured feedback and suggestions. After another round of revisions, the CCLP engages another set of ATE stakeholders to actually try out the resource to ensure it “works” as intended in the real world. Following this pilot testing, EvaluATE finalizes the resource for wide dissemination.

The CCLP has shepherded two resources through the entire review process: the ATE Evaluation Primer and ATE Evaluation Planning Checklist. In the hopper for review in the next few months are the ATE Logic Model Template and Evaluation Planning Matrix, Evaluation Questions Checklist, ATE Evaluation Reporting Checklist, and Professional Development Feedback Survey Template. In addition, CCLP members are leading the development of a Guide to ATE Evaluation Management—by PIs for PIs.

The CCLP invites anyone interested in ATE evaluation to participate in the review process. For a few hours of your time, you’ll get a first look at and tryout of new resources. And your inputs will help shape and strengthen the ATE evaluation community. We also welcome recommendations of tools and materials that others have developed that would be of interest to the ATE community.

To get involved, email CCLP Director Mike Lesiecki at mlesiecki@gmail.com. Tell him you would like to help make EvaluATE be the go-to evaluation resource for people like yourself.

Webinar: The Nuts and Bolts of ATE Evaluation Reporting

Posted on May 15, 2013 in Webinars

Presenter(s): Jason Burkhardt, Krystin Martens, Lori Wingate, MATE – Marine Advanced Technology Education Center, Michael Lesiecki
Date(s): May 15, 2013
Time: 1:00 p.m. EDT
Recording: https://vimeo.com/66343717

In this webinar, we will give practical advice about evaluation reporting in the ATE context, including report content and structure, integrating evaluation report content into annual reports to NSF, and using results. We will provide step-by-step guidance for developing an ATE evaluation report that balances the competing demands that reports be both comprehensive and concise. We'll discuss the how, where, and what of including evaluation results in NSF annual reports and project outcome reports. Finally, we'll address how to use evaluation results to inform project-level improvements and build the case for further funding. Participants will leave the webinar with a clear strategy for creating effective ATE evaluation reports that meet NSF accountability requirements and support project-level improvement. *This webinar was inspired, in part, by the work of Jane Davidson, author of Evaluation Methodology Basics: The Nuts and Bolts of Sound Evaluation (Sage, 2005).

Resources:
Slide PDF
Handout PDF
Example highlights report: MATE Program Highlights