Gordon Snyder

Co-Principal Investigator, OP-TEC Center

Gordon F. Snyder, Jr. is past Executive Director and Principal Investigator of the sunsetted National Center for Information and Communications Technologies (ICT Center) at Springfield Technical Community College (STCC) in Massachusetts. He now serves as a Co-Principal Investigator with the National Center for Optics and Photonics Education (OP-TEC) in Waco, Texas. At STCC he helped develop the Verizon Next Step program and continues to serve as a telecommunications curriculum consultant for the program. He is the author of four engineering and engineering technology textbooks and has over 20 years of consulting experience in software development, communications, and LAN/WAN design. In 2001 he was selected as one of the top fifteen technology faculty in the United States by Microsoft Corporation and the American Association of Community Colleges, and in 2004 he was named the Massachusetts Networking and Communications Council Workforce Development Leader of the Year. He is well known in the social media space, with his content followed by thousands.
He has a strong interest in evaluation and in the development of accessible tools that evaluators and PIs can use for communication and the formative enhancement of project impacts.


Blog: Some of My Favorite Tools and Apps

Posted on May 20, 2015 in Blog

Co-Principal Investigator, OP-TEC Center

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

The Formative Assessment Systems for ATE project (FAS4ATE) focuses on assessment practices that serve the ongoing evaluation needs of projects and centers. Determining these information needs and organizing data collection activities is a complex and demanding task, and we’ve used logic models as a way to map them out. Over the next five weeks, we offer a series of blog posts that provide examples and suggestions of how you can make formative assessment part of your ATE efforts. – Arlen Gullickson, PI, FAS4ATE

Week 5 – How am I supposed to keep track of all this information?

I’ve been involved in NSF center and project work for over 17 years now. When it comes to keeping track of information from meetings, having a place to store evaluation data, and tracking project progress, there are a few habits and apps I’ve found particularly useful.

Habit: Backing up my files
All the apps I use are cloud-based, so I can access my files anywhere, anytime, with any device. However, I also use Apple's Time Machine to automatically back up my entire system to an external hard drive on a daily basis. I also have three cloud-based storage accounts (Dropbox, Google Drive, and Amazon Cloud Drive). When the FAS4ATE files on Dropbox were accidentally deleted last year, I was able to upload my backup copy, and we recovered everything with relative ease.
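I keep these backups with off-the-shelf tools rather than scripts, but the core habit – keeping a second, independent copy of a working folder so an accidental deletion in one copy can be restored from the other – can be sketched in a few lines. The sketch below is illustrative only; the folder names are hypothetical, and a real setup would run on a schedule (as Time Machine does) rather than by hand.

```python
# Minimal sketch of the backup habit described above: replace a backup
# folder with a fresh snapshot of the working folder. Paths and names
# here are hypothetical examples, not part of the original post.
import shutil
from pathlib import Path

def mirror(src: Path, dest: Path) -> None:
    """Replace dest with an exact snapshot of src (a simple full backup)."""
    if dest.exists():
        shutil.rmtree(dest)      # discard the previous snapshot
    shutil.copytree(src, dest)   # copy the tree, preserving file metadata

# Example: mirror a (hypothetical) Dropbox folder to a second drive.
# mirror(Path.home() / "Dropbox/FAS4ATE", Path("/Volumes/Backup/FAS4ATE"))
```

Note that a full-replace mirror like this keeps only the latest snapshot; versioned tools such as Time Machine also let you reach back to older copies.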

Habit + App: Keeping track of notes with Evernote
I've been using Evernote since the beta was released in 2008 and absolutely love it. If you've been in a meeting with me and seen me typing away – I'm not emailing or tweeting – I'm taking meeting notes in Evernote. Notes can include anything (text, pictures, web links, voice memos, etc.), and you can attach things like Word documents and spreadsheets. Notes are organized in folders and are archived and searchable from any connected device. Versions are available for all of the popular operating systems and devices, and notes can easily be shared among users. If it's a weekend and I'm five miles off the coast fishing and you call me about a meeting we had seven years ago, guess what? With a few clicks I can run a search from my phone, find those notes, and send them to you in a matter of seconds. Evernote has both a free, limited version and an inexpensive paid version.

App: LucidChart
When we first started the FAS4ATE project, we thought we'd be developing our own cloud-based, logic model dashboard-type app. We decided to start by looking at what was already out there, so we investigated a number of project management apps, like Basecamp. We tried to force Evernote into a logic model format, and we liked DoView. For now, however, we've decided to go with LucidChart. LucidChart is a web-based diagramming app that runs in a browser and allows multiple users to collaborate and work together in real time. The app supports in-editor chat, comments, and video chat. It is fully integrated with Google Drive and Microsoft Office 2013 and right now appears to be our best option for collaborative (evaluator, PI, etc.) logic model work. You may have seen this short video logic model demonstration.

As we further develop our logic model-based dashboard, we’ll be looking for centers and projects to pilot it. If you are interested in learning more about being a pilot site, contact us by emailing Amy Gullickson, one of our co-PIs, at amy.gullickson@unimelb.edu.au. We’d love to work with you!

Blog: Figures at Your Fingertips

Posted on October 28, 2014 in Blog

Co-Principal Investigator, OP-TEC Center

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

In formative evaluation, programs or projects are typically assessed during their development or early implementation to provide information about how best to revise and modify for improvement. (www.austinisd.org)

I'm Gordon Snyder, and I'm currently the principal investigator of the National Center for Information and Communications Technologies (ICT Center). My experience as a new ATE center PI back in July of 2001 did not get off to a very smooth start. With the retirement of the founding PI after three years, I was moving from a co-PI position and was faced with making a number of decisions and changes in a short period of time. We were right in the middle of the "dot-com bust," and the information and communications technology field was in free fall. I knew our decisions needed to be data-driven, smart, focused, quick, and correct if we were going to continue to be a resource for faculty and students.

As a center co-PI during the boom times between 1998 and 2000, I focused on curriculum development and on helping faculty learn and teach new technology in their classrooms and labs. I honestly neither understood nor paid much attention to the work our evaluator was doing – that was something the PI liked to handle, and I was perfectly fine with that.

In my new role as a PI, things changed. One of the first things I did was read the evaluation reports for the past two years. I found a lot of flowery, complimentary language and little else in those reports – I recall using the term "pile of fluff," along with a few others that I won't repeat here. I found nothing substantial that was going to help me make any decisions.

In August of 2001, I received our year 3 annual evaluation report, and this one was even more "fluffy." Lesson learned: Within a month I dismissed that evaluator, replacing that individual with someone more in tune with what we needed. Things were much better with the new evaluator, but I still found it difficult to make intelligent, data-based decisions. I did not have the information I needed. There had to be a better way.

Fast forward to today: ATE PIs need even more access to valid, reliable, useful evaluative data for decision making. This data needs to be available in real time, or close to real time, throughout the funding cycle – more frequently than the typical annual evaluation report. However, most PIs still simply do not have the time, resources, and expertise required to systematize the collection and use of this kind of information.

Logic models are one method that's catching on for keeping track of and using information to make formative, data-driven decisions. I've been working on the FAS4ATE (Formative Assessment Systems for ATE) project with Western Michigan University, which will ultimately develop logic model-based online tools to streamline data collection and to more effectively scope and plan evaluation activities that include both formative and summative processes. We're early in the development process. Here's a short video demonstrating our prototype of one of the tools.

Logic models are a great way to keep up with project work and to more quickly and confidently make data-based decisions. If you'd like to learn more about this formative assessment project, contact me at gordonfsnyder@gmail.com.

Newsletter: Meet EvaluATE’s Community College Liaison Panel

Posted on January 1, 2014 in Newsletter

The ATE program is community college-based, and as such EvaluATE places a priority on meeting the needs of this constituency. To help ensure the relevancy and utility of its resources, EvaluATE has convened a Community College Liaison Panel (CCLP). CCLP members Michael Lesiecki, Marilyn Barger, Jane Ostrander, and Gordon Snyder are tasked with keeping the EvaluATE team tuned into the needs and concerns of 2-year college stakeholders and engaging the ATE community in the review and pilot testing of EvaluATE-produced materials.

These resources distill relevant elements of evaluation theory, principles, and best practices so that a user can quickly understand and apply them for a specific evaluation-related task. They are intended to support members of the ATE community to enhance the quality of their evaluations.

The CCLP’s role is to coordinate a three-phase review process. CCLP members conduct a first-level review of an EvaluATE resource. The EvaluATE team revises it based on the CCLP’s feedback, then each of the four CCLP members reaches out to diverse members of the ATE community—PIs, grant developers, evaluators, and others—to review the material and provide confidential, structured feedback and suggestions. After another round of revisions, the CCLP engages another set of ATE stakeholders to actually try out the resource to ensure it “works” as intended in the real world. Following this pilot testing, EvaluATE finalizes the resource for wide dissemination.

The CCLP has shepherded two resources through the entire review process: the ATE Evaluation Primer and ATE Evaluation Planning Checklist. In the hopper for review in the next few months are the ATE Logic Model Template and Evaluation Planning Matrix, Evaluation Questions Checklist, ATE Evaluation Reporting Checklist, and Professional Development Feedback Survey Template. In addition, CCLP members are leading the development of a Guide to ATE Evaluation Management—by PIs for PIs.

The CCLP invites anyone interested in ATE evaluation to participate in the review process. For a few hours of your time, you'll get a first look at, and a chance to try out, new resources. And your input will help shape and strengthen the ATE evaluation community. We also welcome recommendations of tools and materials that others have developed that would be of interest to the ATE community.

To get involved, email CCLP Director Mike Lesiecki at mlesiecki@gmail.com. Tell him you would like to help make EvaluATE the go-to evaluation resource for people like yourself.

Webinar: Making Evaluation Integral to your ATE Proposal

Posted on July 21, 2010 in Webinars

Presenter(s): Gordon Snyder, Karl Kapp, Linnea Fletcher, Lori Wingate, Peggie Weeks, Stephanie Evergreen
Date(s): July 21, 2010
Recording: https://vimeo.com/13577194

In this free, 90-minute webinar, participants will learn how to make evaluation a strong component of their ATE proposals. Staff from the ATE Evaluation Resource Center will provide guidance about how to focus an ATE evaluation, develop a plan for data collection and analysis, describe the evaluation in a proposal, and work with an evaluator.

The webinar will feature NSF-ATE program officer Linnea Fletcher, who will provide NSF's perspective on these topics. Gordon Snyder and Karl Kapp, a veteran ATE PI-evaluator team, will also join the webinar to talk about their successful experiences working together on funded ATE proposals.

Resources:
Slide PDF
Handout PDF