We specialize in the systematic collection and analysis of client information to improve program effectiveness and enable informed decisions.
What is measured will be improved.
TechMIS provides evidence-based analysis through relevant, non-intrusive, and quantifiable Measures of Performance and Effectiveness (MoP and MoE) built during program planning, rigorous monitoring frameworks, and sound statistical analysis.
This approach enables real-time decision making and focuses program participants and stakeholders on key goals, resulting in better outcomes.
TechMIS provides innovative monitoring and evaluation in austere locations. Our approach follows the scientific method: we conduct up-front research to develop hypotheses, then build clear, quantifiable Measures of Performance and Effectiveness from program goals to test them. We use high- and low-tech data collection and rigorous statistical analysis to assess program effectiveness honestly and accurately. By combining quantitative analysis, quantification of qualitative data where possible, and synthesis of descriptive narratives where it is not, we give the client a 360° view of program effectiveness.
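For illustration, the sketch below shows one way a single quantifiable Measure of Effectiveness might be tested against baseline and endline survey samples. The measure, the figures, and the library choice (statsmodels) are assumptions made for this example only.

```python
# Minimal sketch: testing one hypothetical Measure of Effectiveness (MoE)
# against baseline and endline survey samples. All figures are invented.
from statsmodels.stats.proportion import proportions_ztest

# Hypothesis: the program increases the share of respondents reporting
# confidence in a local institution (the MoE) between baseline and endline.
baseline_yes, baseline_n = 214, 600   # "yes" responses / sample size at baseline
endline_yes, endline_n = 291, 620     # "yes" responses / sample size at endline

# One-sided two-proportion z-test: is the endline proportion larger?
stat, p_value = proportions_ztest(
    count=[endline_yes, baseline_yes],
    nobs=[endline_n, baseline_n],
    alternative="larger",
)

print(f"MoE baseline: {baseline_yes / baseline_n:.1%}, "
      f"endline: {endline_yes / endline_n:.1%}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Evidence the MoE improved; examine whether the program drove the change.")
else:
    print("No statistically significant improvement detected for this MoE.")
```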
Our experts are skilled in statistical analysis, marketing, psychology, socio-cultural analysis and historical analysis. This allows us to tailor our approach to the client’s needs and the study population.
What can be seen can be changed.
Graphic presentation of our continuous data collection and assessment ensures that managers and participants know how a project is performing. This makes good programs better and enables adjustments when goals are not being met. Examples of our graphic techniques include heat maps, geo-located population spikes, connectivity graphs, network diagrams, and fishbone diagrams.
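As a simple illustration of one of these techniques, the sketch below renders a heat map from synthetic geo-located observations; the coordinates, clustering, and output file name are invented for this example and do not represent any actual project data.

```python
# Minimal sketch: rendering a heat map of geo-located field observations.
# All coordinates are randomly generated for illustration only.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(seed=0)

# Synthetic observations clustered around two notional population centers.
lons = np.concatenate([rng.normal(69.17, 0.05, 400), rng.normal(69.35, 0.08, 250)])
lats = np.concatenate([rng.normal(34.53, 0.05, 400), rng.normal(34.60, 0.08, 250)])

# A 2-D histogram counts observations per grid cell, producing the heat map.
plt.hist2d(lons, lats, bins=50, cmap="hot")
plt.colorbar(label="Observation count")
plt.xlabel("Longitude")
plt.ylabel("Latitude")
plt.title("Density of field observations (synthetic data)")
plt.savefig("observation_heatmap.png", dpi=150)
```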
Our continuous collection and assessment process enables in-course corrections to capitalize on successes or fleeting opportunities, adjust to changing situations, and fix or end non-performing projects.
Learning Only Occurs When Behavior Changes.
TechMIS has developed and trademarked its “Start Better, Get Better, Stay Better™” methodology to enable continuous learning.
Start Better.
Starting better means ensuring that past lessons from all sources are rigorously researched and presented to the client to improve planning and preparation. No matter what the project, we have been there before. We call this process a Before-Action Review, or BAR. In it, we synthesize past lessons drawn from historical records, existing and emerging best practices, briefings, organizational charts, literature, and media reports, among other sources, and place them in the context of the current project, extracting the critical, germane insights.
Few organizations use this critical BAR tool, opting instead to focus on the more familiar After-Action Review (AAR). We have found that our BAR methodology increases capabilities from the very beginning and avoids wasted planning time and misdirected preparation.
Get Better.
Getting better relies on assessing progress toward intended goals and researching past lessons as the situation evolves and the context becomes clearer.
Because we develop MoP and MoE before operations begin, we can run a rigorous assessment process that helps determine whether our client is doing things right and doing the right things, and we disseminate those findings throughout the organization. This knowledge transfer ensures our client gets better throughout the effort, despite personnel and market turbulence.
Stay Better.
Disciplined, useful After-Action Reviews (AARs) capture lessons after operations. These are synthesized with the lessons collected throughout operations into “final” lessons that are shared and archived for use in follow-on efforts. The result is a set of fully supportable lessons and solutions, grounded in rigorous, scientifically defensible collection and analysis, that the client can implement.
Analytic Approaches:
Evaluate Theory of Change
“Are we doing the right things?”
Analysis Factors
- Development Outcomes
- Logic Trace
- Historical Successes/Challenges
- Cultural Appropriateness
- Resource Allocation
- Stakeholders
- Identify Explicit and Implicit Assumptions
- Integration/Interaction With Other Programs
- Resource Absorption Rate and Required Threshold
Performance and Impact Evaluations
“Are we doing things right?”
Baseline and Longitudinal Monitoring
- Program Design
- Resource Allocation
- Attitudes, Beliefs, and Behavior
- Program Effectiveness
- Counterfactual Analysis (see the sketch following this list)
- Comparison With Expected Improvement Curve
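As a bare-bones illustration of the counterfactual analysis noted above, the sketch below computes a simple difference-in-differences estimate from invented baseline and endline figures; a real evaluation would work from full survey data with appropriate statistical controls.

```python
# Minimal sketch: difference-in-differences estimate of program impact using
# hypothetical mean MoE scores for treated and comparison districts.
treated_baseline, treated_endline = 42.0, 55.0
comparison_baseline, comparison_endline = 41.0, 47.0

# Counterfactual: absent the program, the treated group is assumed to have
# followed the comparison group's trend.
comparison_trend = comparison_endline - comparison_baseline
counterfactual_endline = treated_baseline + comparison_trend

# Difference-in-differences: observed change minus the counterfactual change.
impact = (treated_endline - treated_baseline) - comparison_trend
print(f"Counterfactual endline score: {counterfactual_endline:.1f}")
print(f"Estimated program impact (DiD): {impact:+.1f} points")
```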
Tools:
- In-country network development
- Direct and Indirect Observation
- Face-to-Face (F2F) and Remote Surveys
- Focus Groups
- Advanced Statistical Analysis
- Curated Data and Reports
- Economic Assessments
- Social Assessments
Clients:
Special Operations Command, Department of Defense Joint Staff, Broadcasting Board of Governors, NASA, Boeing, Booz Allen Hamilton, Department of Veterans Affairs