
Grant Proposal Evaluation Plan Example: Templates and Metrics That Funders Trust

*Formative vs. summative evaluation illustrated as tasting soup while cooking vs. reviewing the final dish*

GrantCue Team

Feb 24, 2026

4 min read

Build a grant proposal evaluation plan that funders trust. Covers formative vs. summative evaluation, logic model basics, internal vs. external evaluation, data collection methods, and includes a complete evaluation plan example with a metrics table.

Here's something nobody tells you until you've already submitted a proposal without one: evaluation plans are increasingly the section that separates funded proposals from rejected ones.

A decade ago, many foundation grants didn't require an evaluation plan at all. Today, even small family foundations want to know how you'll measure success. Federal grants have always demanded it, and the bar keeps rising — USDA, DOE, and HHS all released updated evaluation guidance in 2024-2025 that emphasizes rigorous, equity-centered evaluation design.

The good news? A strong evaluation plan isn't as complicated as it sounds. It's a structured answer to one question: How will you know if your project worked?

Formative vs. Summative Evaluation (In Plain English)

Every evaluation plan involves two types of evaluation. Don't let the jargon intimidate you — they're intuitive once you hear the analogy:

  • Formative evaluation = Checking the soup while it's cooking. You taste it, adjust the seasoning, add more salt. It happens during the program and helps you make real-time improvements.
  • Summative evaluation = Reviewing the restaurant after dinner. Did the meal meet expectations? Would you recommend it? It happens after the program (or at key milestones) and judges overall effectiveness.

Your evaluation plan needs both. Formative evaluation shows the funder you'll adapt if something isn't working. Summative evaluation proves you delivered results.

| | Formative | Summative |
|---|---|---|
| **When** | During implementation | At the end (or at milestones) |
| **Purpose** | Improve the program | Judge the program |
| **Questions** | Are we on track? What needs adjusting? | Did we achieve our objectives? |
| **Methods** | Staff feedback, attendance logs, mid-program surveys | Pre/post assessments, outcome data analysis, final reports |
| **Audience** | Project team, advisory board | Funder, board, stakeholders |

Logic Model Basics for Grant Proposals

Most federal grants and many foundation grants expect a logic model — a visual map of how your program creates change. Think of it as a flowchart:

Inputs → Activities → Outputs → Short-term Outcomes → Long-term Outcomes

Here's a practical example for a youth mentoring program:

| Component | Example |
|---|---|
| **Inputs** | $50,000 grant, 2 staff, 25 volunteer mentors, meeting space, curriculum |
| **Activities** | Recruit mentors, train mentors, match with youth, weekly 1-hour sessions, monthly group events |
| **Outputs** | 50 youth enrolled, 25 mentor pairs matched, 40 weekly sessions held, 10 group events |
| **Short-term Outcomes** | 80% of youth report increased sense of belonging; 75% show improved school attendance |
| **Long-term Outcomes** | Reduced dropout rates; increased postsecondary enrollment among participants |

The logic model isn't just a diagram — it's the skeleton of your entire evaluation plan. Every output and outcome in the model should appear as a measurable indicator in your evaluation framework.

Your objectives drive the logic model. If you haven't nailed those yet, our goals and objectives guide walks through the SMART framework step by step.

Internal vs. External Evaluation

Should you evaluate your own program, or hire someone else to do it?

Internal Evaluation

  • Best for: Smaller grants (under $100K), programs with straightforward metrics, formative evaluation
  • Pros: Lower cost, deeper program knowledge, faster feedback loops
  • Cons: Potential bias, may lack methodological expertise
  • Who does it: Program staff, a dedicated evaluator on staff, or a board member with research expertise

External Evaluation

  • Best for: Federal grants, grants over $100K, programs requiring rigorous outcome measurement, when the funder requires it
  • Pros: Objectivity, methodological rigor, credibility with funders
  • Cons: Higher cost ($5,000-$50,000+), less program context, slower turnaround
  • Who does it: University researchers, evaluation consulting firms, independent evaluators

Pro tip: Many mid-size grants use a hybrid approach — internal staff handle formative evaluation and data collection, while an external evaluator designs the framework and conducts the summative analysis. Budget for the external evaluator in your grant budget from day one.

Data Collection Methods by Program Type

Match your data collection to your program and your capacity:

| Program Type | Common Data Sources | Collection Methods |
|---|---|---|
| **Education** | Test scores, attendance records, grades | School district data sharing, teacher surveys, pre/post assessments |
| **Health** | Clinical metrics, screening results, utilization data | EHR extracts, patient surveys, claims analysis |
| **Workforce** | Employment records, wage data, credential attainment | Participant tracking, employer surveys, state wage database matching |
| **Youth Development** | Social-emotional assessments, behavioral records | Validated instruments (DESSA, SDQ), school records, participant interviews |
| **Environment** | Water/air quality metrics, species counts, acreage | Field monitoring, satellite imagery, lab analysis |

The key: use existing data sources whenever possible. Don't build a custom database when the school district already tracks attendance. Don't survey patients when the EHR already captures the metric.

Complete Evaluation Plan Example

Here's a full evaluation plan for a workforce development program. Use this as a template:

Program: CareerLaunch — Adult Workforce Training Initiative

Evaluation Approach: Mixed-methods evaluation combining quantitative outcome tracking with qualitative participant feedback. Internal program staff will manage data collection, with an external evaluator (Dr. Sarah Kim, Lakewood University) conducting quarterly analysis and the final summative report.

Evaluation Questions:

  1. To what extent did participants complete the training program and attain industry credentials?
  2. Did participants secure employment at target wage levels within 90 days?
  3. What program elements did participants identify as most valuable?
  4. Were outcomes equitable across demographic subgroups?

Metrics Table:

| Objective | Indicator | Data Source | Method | Frequency | Target |
|---|---|---|---|---|---|
| 80 adults complete training | Completion count | Program enrollment database | Staff tracking | Monthly | 80 (100%) |
| 70% attain NIMS credential | Credential verification | NIMS registry | Registry query | Upon completion | 56 (70%) |
| 65% employed within 90 days | Employment status | Participant follow-up | Phone survey + employer verification | 90 days post-completion | 52 (65%) |
| Avg. wage of $18+/hr | Reported wages | Participant self-report, pay stubs | Survey + document review | 90 days post-completion | $18/hr avg. |
| Participant satisfaction | Satisfaction score | Exit survey | 5-point Likert scale survey | At program exit | 4.0+ average |
| Equitable outcomes | Completion/employment by demographics | Program database | Disaggregated analysis | Quarterly + final | No >10% gap between subgroups |
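If your team tracks enrollment in a spreadsheet or database, the equity check in the last row is easy to automate. Here's a minimal sketch in Python using made-up subgroup numbers (the group names and counts are hypothetical; the 10-percentage-point threshold matches the target above):

```python
# Disaggregated completion analysis: compute the completion rate for each
# demographic subgroup and flag any gap larger than 10 percentage points.
# Subgroup names and counts below are hypothetical sample data.

completions = {  # subgroup -> (completed, enrolled)
    "Group A": (22, 25),
    "Group B": (20, 25),
    "Group C": (24, 30),
}

# Completion rate per subgroup
rates = {group: done / total for group, (done, total) in completions.items()}

# Largest gap between any two subgroups
gap = max(rates.values()) - min(rates.values())

for group, rate in sorted(rates.items()):
    print(f"{group}: {rate:.0%} completion")
print(f"Largest subgroup gap: {gap:.1%} -> {'OK' if gap <= 0.10 else 'REVIEW'}")
```

Running the same check quarterly (and again for employment rates) gives you the "quarterly + final" cadence the table promises, and flags inequities while there's still time to adjust the program.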

Reporting Schedule:

  • Monthly: Internal data dashboard reviewed by program team
  • Quarterly: Progress report to funder with formative findings
  • Annual: Comprehensive summative report with recommendations

Data Management: All participant data stored in encrypted database with access limited to evaluation team. IRB exemption obtained through Lakewood University. Data retained for 5 years per federal requirements.

Connecting It All Together

Your evaluation plan doesn't exist in isolation. It's the accountability mechanism for every promise you make in the proposal:

  • Your statement of need defines the problem → your evaluation measures whether you reduced it
  • Your objectives set the targets → your evaluation tracks progress toward them
  • Your budget funds the evaluation → your evaluation justifies the expenditure

That circular connection is exactly what funders look for. For the full picture of how every section connects, see our complete grant proposal guide.

GrantCue's compliance tracking helps you stay on top of evaluation milestones, reporting deadlines, and deliverable schedules — so you never miss a quarterly report or data collection window.

Build the evaluation plan before you write the narrative. It sounds backwards, but when you know how you'll measure success, every other section writes itself.
