In today’s environment, you need data and information quickly to make smart decisions. At Assay|Edu, we apply a rapid-cycle approach to all evaluation activities and provide you with insights through informal and formal channels. Our goal is to help you make quick adjustments and smart changes that will improve program efficacy.

Each evaluation is custom-developed based on the needs of the program, administrators, and funders; our typical evaluation plan may include any of the following components:

A needs assessment is a systematic process that determines and addresses gaps between current conditions and ideal outcomes. We often begin our engagements with a complete needs assessment to help crystallize where you are and where you’re going.

We create a custom, graphically driven logic model to illustrate the linkages between program inputs, activities, outputs, and short-, mid-, and long-term outcomes. The logic model serves as the advance organizer for the evaluation plan.

The process-monitoring component examines how well the program components and strategies are being implemented. Data collection methods often combine quantitative and qualitative techniques, such as surveys, focus groups, interviews, observations, and monitoring dashboards.

Rapid-cycle program evaluation employs rigorous, scientifically based research methods to assess incremental program changes in real time. You modify a program component, then quickly find out whether your change was effective. From there, you continue modifying based on the cycle of rapid evaluation feedback. The goal is to leverage existing data for speedy, continuous program improvement. This means you don't need to wait until the end of the year, or the end of a lengthy analytical process, to learn how your program can better serve students and the community.

A summative component assesses how effectively the project achieved its short-, mid-, and long-term outcomes and identifies the program features and components that were most and least effective. Data may include quantitative program data, school district data, state educational data, state workforce data, and/or national data.

We go after big data to find relationships, patterns, and trends that predict desired outcomes. Using data from school, district, state, and national databases, Assay|Edu builds predictive models that generate data-driven hypotheses about the best future course of action, supporting insightful, evidence-based decision making.

We like to describe our formal reporting practices by telling you what you won’t get: a massive, dense written report filled with inscrutable charts and graphs that will collect dust on a shelf for years to come.

Instead, we prepare a comprehensive, graphic-filled, and actionable presentation that clearly spells out our findings and emphasizes recommendations. Our reports are modular, so individual sections can easily be shared to socialize findings with any audience, at any time.

We also employ informal reporting, whereby critical findings are disseminated immediately via emails, conference calls, meetings, or memos. We don't make you wait to hear important findings!

Our #1 goal is to see your program improve and thrive; our reports are designed with that goal top of mind.