📈 Evidence Strength Assessment Guide
A framework for evaluating insight reliability, converting stakeholder input into validated project intelligence, and building a credible evidence base that supports strong proposals and community-grounded project design.
🎯 Evidence Assessment Framework
Why Evidence Strength Matters
For Project Credibility:
- Funders require evidence-based analysis to support funding decisions
- Strong evidence base increases proposal success rates
- Credible analysis builds stakeholder confidence in project approach
- Evidence transparency allows others to assess and build on your work
For Implementation Success:
- Evidence strength indicates reliability for project planning decisions
- Strong evidence reduces risk of design errors based on weak assumptions
- Graduated evidence levels help prioritize where additional validation is needed
- Evidence assessment guides ongoing learning and adaptation strategies
Evidence Categories System
EVIDENCE STRENGTH LEVELS:
STRONG EVIDENCE (E):
- Multiple credible sources confirm finding
- Specific data, examples, or quantitative support provided
- Consistent across different conversation contexts and settings
- Aligns with credible secondary research where available
- Source has direct experience/knowledge relevant to insight
MODERATE EVIDENCE (E*):
- Several sources mention finding with some variation
- Some specific examples provided but not comprehensive
- Generally consistent with some contextual differences noted
- Partially supported by secondary research or expert opinion
- Source has relevant but not direct experience
VALIDATED ASSUMPTION (A→E):
- Original assumption confirmed through stakeholder input
- Community language may differ from original framing
- Additional context provided that strengthens assumption
- Stakeholder validation reduces uncertainty about assumption
WORKING ASSUMPTION (A):
- Single source or limited confirmation
- Logical inference but not definitively supported
- Important hypothesis requiring testing during implementation
- Reasonable belief based on available information but needs validation
CONTRADICTED ASSUMPTION (A→X):
- Original assumption challenged by stakeholder input
- Community evidence suggests different reality
- Requires revision of original analysis based on stakeholder insights
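The five evidence levels above form a small, fixed taxonomy, so they can be modeled as an enumeration for teams that track insights in a spreadsheet or script. The following Python sketch is purely illustrative (the class and field names are assumptions, not part of the framework itself):

```python
from dataclasses import dataclass, field
from enum import Enum

class EvidenceLevel(Enum):
    """The five evidence strength levels from the framework above."""
    STRONG = "E"                    # multiple credible sources, specific support
    MODERATE = "E*"                 # several sources with some variation
    VALIDATED_ASSUMPTION = "A->E"   # assumption confirmed by stakeholder input
    WORKING_ASSUMPTION = "A"        # single source or limited confirmation
    CONTRADICTED = "A->X"           # assumption challenged by stakeholder input

@dataclass
class Insight:
    """One tagged insight: the finding, its level, and supporting sources."""
    text: str
    level: EvidenceLevel
    sources: list = field(default_factory=list)

    def label(self) -> str:
        # Prefix the insight with its evidence marker for reports
        return f"[{self.level.value}] {self.text}"

insight = Insight(
    text="Transport costs consume a large share of daily wages",
    level=EvidenceLevel.MODERATE,
    sources=["daily wage workers (group discussion)", "local NGO staff"],
)
print(insight.label())  # [E*] Transport costs consume a large share of daily wages
```

Keeping the marker (`E`, `E*`, `A`, `A→E`, `A→X`) attached to each insight makes the later documentation and proposal-writing steps mechanical rather than ad hoc.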
📋 Source Credibility Assessment
Stakeholder Source Evaluation
High Credibility Sources:
✅ Direct Experience Indicators:
- Lives with or directly affected by the problem
- Has attempted solutions or interventions in the past
- Provides specific examples from personal/community experience
- Knowledge spans significant time period (historical context)
✅ Positional Knowledge:
- Professional role provides relevant insight (teachers on education, health workers on health)
- Community leadership position with broad perspective
- Organizational experience implementing similar projects
- Access to information others don't have
✅ Representativeness:
- Speaks for broader group through formal or informal role
- Perspective confirmed by others in similar position
- Demographic characteristics reflect broader affected population
- Geographic/cultural position representative of project area
Example High Credibility Sources:
- Community health worker with 5 years' experience in the target area
- Mother who has navigated health system for children's care
- Teacher working in local schools for multiple years
- Youth who has participated in previous employment programs
Moderate Credibility Sources:
⚪ Relevant but Indirect Experience:
- Works with affected population but not directly affected
- Has professional knowledge but limited local implementation experience
- Community member but in different situation than core problem focus
- Knowledge from formal training/education rather than direct experience
⚪ Positional Limitations:
- Organizational role may influence perspective (NGO staff, government official)
- Recent arrival to community with limited historical knowledge
- Demographic characteristics differ significantly from affected population
- Access to information filtered through institutional channels
Example Moderate Credibility Sources:
- NGO staff member working in area for 6 months
- Government official with regional rather than local responsibility
- Community member from different economic/social situation
- External expert with relevant technical knowledge but limited local experience
Lower Credibility Sources:
❌ Limited Direct Knowledge:
- Brief visit or short-term engagement with community
- Knowledge based on assumptions or second-hand information
- Demographic/social position very different from affected population
- Professional or institutional bias that limits perspective
❌ Agenda-Driven Input:
- Clear organizational or personal interest in promoting specific solutions
- Information provided to influence project design toward particular approach
- Selective sharing of information that supports preferred outcomes
- Historical conflicts of interest affecting objectivity
Note: Even lower credibility sources can provide valuable insights when triangulated with other sources and when their limitations are acknowledged.
Information Quality Indicators
High Quality Information Characteristics:
Specificity:
✅ Provides concrete examples rather than general statements
✅ Includes numbers, timeframes, or specific contexts
✅ Details that can be verified or triangulated with other sources
✅ References specific events, places, or people (appropriately anonymized)
Consistency:
✅ Story remains consistent across different conversation contexts
✅ Details align with information from other credible sources
✅ Stakeholder demonstrates deep familiarity with topic
✅ Information fits logically with broader context and patterns
Nuance:
✅ Acknowledges complexity rather than oversimplifying
✅ Recognizes different perspectives or experiences within community
✅ Identifies conditions under which the information applies and when it doesn't
✅ Shows awareness of limitations of their own knowledge
🔍 Multi-Source Triangulation
Evidence Triangulation Process
Step 1: Source Mapping
For each key insight, identify:
Primary Sources: [Who provided this insight directly]
- Source A: [Name/type, credibility level, context of sharing]
- Source B: [Name/type, credibility level, context of sharing]
- Source C: [Name/type, credibility level, context of sharing]
Secondary Validation: [Who confirmed or contradicted this insight]
- Validator 1: [How they confirmed/challenged the insight]
- Validator 2: [Additional perspective on the insight]
Contextual Factors: [What affects reliability of this insight]
- Setting: [Formal interview, casual conversation, group discussion]
- Timing: [Recent experience, historical perspective, hearsay]
- Relationships: [Stakeholder relationship to problem, to organization]
Step 2: Triangulation Analysis
TRIANGULATION ASSESSMENT MATRIX:
Insight: [Specific insight being evaluated]
Source Agreement:
□ Multiple sources provide identical or very similar information
□ Sources provide similar information with contextual variations
□ Sources provide somewhat different perspectives on same underlying issue
□ Sources contradict each other significantly
Source Credibility:
□ All sources have high credibility and direct experience
□ Mix of high and moderate credibility sources
□ Primarily moderate credibility sources
□ Limited credibility sources but best available for this topic
Evidence Specificity:
□ Multiple sources provide specific examples and details
□ Some sources provide specifics, others more general
□ Generally vague information but consistent pattern
□ Limited specific supporting detail
Triangulation Conclusion:
□ STRONG EVIDENCE (E) - Multiple credible sources with specific confirmation
□ MODERATE EVIDENCE (E*) - Some credible sources with general confirmation
□ WORKING ASSUMPTION (A) - Limited sources but reasonable hypothesis
□ REQUIRES VALIDATION - Insufficient information for confidence
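The matrix above can be read as a simple decision rule: source agreement, source credibility, and evidence specificity each contribute to the conclusion. A minimal Python sketch of that rule (the scoring scheme and cut-offs are illustrative assumptions, not prescribed by the framework):

```python
def triangulation_conclusion(agreement: str, credibility: str, specificity: str) -> str:
    """Map the three matrix dimensions to an evidence level.

    Each argument is one of "high", "mixed", or "low". The numeric
    cut-offs below are illustrative, not part of the framework itself.
    """
    score = {"high": 2, "mixed": 1, "low": 0}
    total = score[agreement] + score[credibility] + score[specificity]
    if total >= 5:
        return "STRONG EVIDENCE (E)"
    if total >= 3:
        return "MODERATE EVIDENCE (E*)"
    if total >= 1:
        return "WORKING ASSUMPTION (A)"
    return "REQUIRES VALIDATION"

print(triangulation_conclusion("high", "high", "mixed"))    # STRONG EVIDENCE (E)
print(triangulation_conclusion("mixed", "mixed", "mixed"))  # MODERATE EVIDENCE (E*)
```

Whatever cut-offs a team chooses, writing them down once keeps triangulation judgments consistent across analysts and across themes.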
Contradiction Analysis Framework
When Sources Disagree:
CONTRADICTION INVESTIGATION:
Insight A: [First perspective on issue]
Source: [Who provided this perspective and their credibility]
Context: [Setting, timing, stakeholder characteristics]
Insight B: [Conflicting perspective on same issue]
Source: [Who provided conflicting perspective and their credibility]
Context: [Setting, timing, stakeholder characteristics]
Possible Explanations for Contradiction:
□ Different contexts - both insights true in different situations
□ Different time periods - situation has changed between perspectives
□ Different populations - insights reflect experiences of different groups
□ Different aspects - insights address different dimensions of same issue
□ Different reliability - one source more credible than the other
□ Measurement differences - insights use different standards or definitions
Resolution Approach:
□ Accept both insights as reflecting different valid experiences
□ Investigate further through additional stakeholder conversations
□ Weight insights based on source credibility and specificity
□ Note contradiction as area requiring ongoing attention during implementation
Synthesis Decision:
[How contradiction was handled in theme development and why]
📊 Quantitative vs Qualitative Evidence
Integrating Different Evidence Types
Quantitative Evidence from Stakeholders:
High Value Quantitative Insights:
✅ Specific numbers stakeholders use in daily life
✅ Percentages or proportions they've observed directly
✅ Timeframes and frequencies from lived experience
✅ Costs, prices, or economic data from personal experience
Examples:
- "Transportation costs 30% of daily wages" (from daily wage workers)
- "Only 3 out of 20 graduates from last year found jobs" (from training program graduate)
- "Clinic is closed 2 days per week due to staff shortages" (from clinic user)
Validation Considerations:
- Stakeholder math may be approximate but experientially accurate
- Community counting systems may differ from formal systems
- Percentages may be impressionistic but still valuable
- Economic data from stakeholders often reflects lived reality more closely than official statistics
Qualitative Evidence Integration:
Strong Qualitative Evidence:
✅ Consistent themes across multiple stakeholder conversations
✅ Emotional intensity around particular issues
✅ Stories that illustrate broader patterns
✅ Cultural or social dynamics explanations
Qualitative-Quantitative Connections:
- Use qualitative insights to interpret quantitative data
- Let stakeholder stories provide context for statistics
- Allow community priorities to weight statistical importance
- Use qualitative insights to identify what quantitative data to seek
Evidence Combination Strategies
Mixed Evidence Synthesis:
When combining different evidence types:
Complementary Evidence:
- Quantitative data supports qualitative themes
- Stakeholder stories explain statistical patterns
- Community examples illustrate broader trends
- Different evidence types reinforce same conclusions
Contradictory Evidence:
- Statistical data contradicts stakeholder experience
- Official reports differ from community observations
- Expert opinions conflict with community knowledge
- Different measurement approaches yield different conclusions
Resolution Approaches:
1. Weight community experience over official data when the question concerns lived reality
2. Use community insights to interpret why statistics may be misleading
3. Acknowledge both perspectives and explore implications
4. Plan additional validation during project implementation
🎯 Evidence Application Framework
Evidence-Based Theme Development
Theme Evidence Standards:
STRONG THEME (E):
Requires:
- Multiple credible sources provide confirming insights
- Specific examples and details support general pattern
- Consistent across different conversation contexts
- Some quantitative or concrete supporting information
- Cultural and contextual factors well understood
MODERATE THEME (E*):
Requires:
- Several sources mention pattern with some variation
- Some specific supporting examples available
- Generally consistent with acknowledged contextual differences
- Logical pattern supported by available evidence
- Areas for further validation identified
WORKING HYPOTHESIS THEME (A):
Acceptable when:
- Limited sources but pattern seems important
- Logical inference from available information
- Important for project design even if not fully validated
- Plan for validation during implementation exists
- Uncertainty acknowledged explicitly
Evidence Documentation Template
THEME EVIDENCE ASSESSMENT:
Theme: [Theme name and description]
SUPPORTING EVIDENCE:
Strong Evidence Sources:
- Source 1: [Name/type] - "[Key supporting quote]"
- Credibility: [High/Moderate/Lower] - [Why]
- Specificity: [Specific examples provided]
- Source 2: [Name/type] - "[Key supporting quote]"
- Credibility: [High/Moderate/Lower] - [Why]
- Specificity: [Specific examples provided]
Moderate Evidence Sources:
- [List sources with moderate support]
TRIANGULATION ANALYSIS:
Source Agreement Level: [High/Moderate/Limited]
Evidence Specificity: [Concrete/General/Vague]
Contextual Consistency: [Consistent/Variable/Contradictory]
CONTRADICTORY EVIDENCE:
Conflicting Insights: [Any contradictory information]
Contradiction Analysis: [How contradictions explored and resolved]
EVIDENCE STRENGTH CONCLUSION: [Strong(E) / Moderate(E*) / Working Hypothesis(A)]
VALIDATION PLAN:
□ No additional validation needed - evidence sufficient
□ Plan stakeholder validation of theme framing and language
□ Requires additional investigation during project implementation
□ Needs follow-up conversations with specific stakeholder groups
ACTION IMPLICATIONS:
How evidence strength affects use of this theme:
- [If Strong]: Primary focus for project design and intervention planning
- [If Moderate]: Important consideration with built-in learning and adaptation
- [If Working Hypothesis]: Area for early validation during implementation
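The template above is essentially a structured record, so teams tracking many themes may find it convenient to capture each assessment as a small data structure. A sketch in Python (field names mirror the template; none of them are prescribed by the framework):

```python
from dataclasses import dataclass, field

@dataclass
class ThemeEvidenceAssessment:
    """A structured record mirroring the theme evidence template above."""
    theme: str
    supporting_sources: list = field(default_factory=list)  # (source, quote, credibility)
    agreement_level: str = "Limited"    # High / Moderate / Limited
    specificity: str = "General"        # Concrete / General / Vague
    contradictions: list = field(default_factory=list)
    conclusion: str = "A"               # E / E* / A
    validation_plan: list = field(default_factory=list)

    def action_implication(self) -> str:
        """Translate evidence strength into how the theme should be used."""
        return {
            "E": "Primary focus for project design and intervention planning",
            "E*": "Important consideration with built-in learning and adaptation",
            "A": "Area for early validation during implementation",
        }[self.conclusion]

record = ThemeEvidenceAssessment(
    theme="Transport costs limit clinic access",
    conclusion="E*",
)
print(record.action_implication())
# Important consideration with built-in learning and adaptation
```

Storing assessments this way makes it easy to list, at a glance, which themes are ready for proposal writing and which still carry an explicit validation plan.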
🔄 Evidence Quality Improvement
Strengthening Weak Evidence
When Evidence is Insufficient:
Evidence Improvement Strategies:
Additional Stakeholder Engagement:
- Target specific stakeholder groups who could provide missing perspectives
- Use different engagement methods (individual vs group, formal vs informal)
- Include stakeholders with different demographic characteristics
- Seek out dissenting or minority voices
Secondary Source Integration:
- Review academic research that might support or challenge themes
- Consult organizational reports from similar contexts
- Include government or institutional data where relevant
- Cross-reference with other projects' findings in similar contexts
Pilot Testing Approaches:
- Design small tests of assumptions during project planning phase
- Use early implementation activities as validation opportunities
- Build learning and adaptation mechanisms into project design
- Plan systematic evidence gathering during initial project activities
Evidence-Based Decision Making
Using Evidence Assessment for Project Design:
Strong Evidence Themes (E):
- Primary focus for intervention design and resource allocation
- High confidence for proposal writing and funder communication
- Foundation for Theory of Change development
- Basis for partnership discussions and collaboration planning
Moderate Evidence Themes (E*):
- Important considerations for project design with built-in adaptation
- Areas for early validation during implementation
- Backup planning for alternative approaches if assumptions don't hold
- Collaborative investigation with stakeholders during implementation
Working Hypothesis Themes (A):
- Areas for investigation rather than immediate intervention
- Pilot testing opportunities during project startup
- Partnership opportunities for shared learning and validation
- Explicit uncertainty acknowledgment in project design and proposals
📈 Evidence Communication
Presenting Evidence Transparently
Evidence Transparency in Project Documents:
Proposal Writing:
- Clearly mark evidence levels (E, E*, A) in analysis sections
- Provide source attribution without compromising stakeholder confidentiality
- Acknowledge uncertainties and validation plans
- Include contradictory evidence rather than presenting false consensus
Stakeholder Communication:
- Share evidence assessment with communities for validation
- Explain how community input influenced evidence evaluation
- Acknowledge contributions while maintaining appropriate confidentiality
- Plan ongoing evidence sharing as project implementation provides additional validation
Funder Communication:
- Demonstrate analytical rigor through transparent evidence assessment
- Show systematic approach to community engagement and validation
- Acknowledge limitations while demonstrating strong overall evidence base
- Present evidence upgrading as project design strength rather than weakness
Evidence assessment transforms stakeholder input from anecdotal information into credible intelligence for project design. Transparent acknowledgment of evidence strength builds rather than undermines credibility, and it provides a foundation for adaptive implementation that continues learning and validation throughout the project lifecycle.