14  Process Optimization & Organizational Efficiency

14.1 Optimizing for Impact

Business analysts and public health analysts share a fundamental responsibility: identifying inefficiencies and optimizing processes to maximize program impact. Whether measuring return on investment (BA) or population health outcomes (PH), both domains must demonstrate tangible value. In public health, that value must ultimately be apparent to the taxpayer funding these programs.

This chapter synthesizes strategies from psychology, sociology, project management, informatics, and organizational behavior to create a comprehensive approach to process optimization.

14.1.1 The Dual Framework

BA Perspective PH Perspective
Process Improvement Program Optimization
Operational Efficiency Resource Stewardship
ROI Demonstration Taxpayer Value
Automation Strategy Scalable Interventions
Change Management Implementation Science

14.1.2 The Optimization Hierarchy

Before optimizing a process, apply this decision framework:

flowchart TD
    A[Identify Manual Process] --> B{Mission Critical?}
    B -->|No| C[Eliminate]
    B -->|Yes| D{Can It Be Automated?}
    D -->|Yes| E[Automate with Oversight]
    D -->|No| F[Standardize & Document]
    E --> G{Scalable?}
    F --> G
    G -->|Yes| H[Implement]
    G -->|No| I[Reassess Requirements]
Figure 14.1: Process Optimization Decision Tree
Important: Core Principle

Automation and scale should be prioritized above aesthetics. A functional, scalable system that processes 10,000 records reliably is more valuable than a beautifully designed system that handles 100. Invest in robustness first; polish later.
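The decision tree in Figure 14.1 can be sketched as a small triage function. This is a minimal illustration; the three boolean inputs are assumptions standing in for the real assessments a team would make.

```python
def triage_process(is_mission_critical: bool,
                   can_automate: bool,
                   is_scalable: bool) -> str:
    """Apply the Figure 14.1 decision tree to a manual process."""
    if not is_mission_critical:
        return "eliminate"
    # Mission-critical work is either automated with oversight
    # or standardized and documented for manual execution.
    action = "automate with oversight" if can_automate else "standardize & document"
    if not is_scalable:
        return "reassess requirements"
    return f"implement: {action}"

# A process that is not mission critical is simply eliminated:
print(triage_process(False, True, True))   # eliminate
```

Encoding the tree as code has a side benefit: the triage criteria become reviewable and version-controlled rather than living in one person's head.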

14.2 Foundations of Organizational Efficiency

Process optimization does not happen in a vacuum. Organizations are complex systems where psychological, social, and structural factors interact. Understanding these foundations helps analysts anticipate barriers and design interventions more likely to succeed.

Each section below identifies common barriers alongside practical remedies. These tables are not exhaustive but represent patterns frequently encountered in both BA and public health contexts.

14.2.1 Psychological Foundations

Understanding human motivation and well-being is essential for sustainable process improvement. Optimizing systems without considering the people who operate them leads to burnout, resistance, and ultimately failure.

14.2.1.1 Meaning and Purpose (Frankl)

Viktor Frankl’s logotherapy, developed from his experiences surviving Nazi concentration camps, posits that the primary human drive is the search for meaning. People who find purpose in their work can overcome extraordinary barriers; those who do not will struggle to sustain motivation beyond basic needs.

Principle Workplace Implication
Will to meaning People seek work that matters, not just paychecks
Meaning through contribution Staff must see how their work affects outcomes
Suffering with purpose Difficult work becomes tolerable when meaningful
Choice of attitude Even constrained roles allow choice in how we engage

Application to Public Health Programs:

Public health work is inherently meaningful: protecting communities, preventing disease, saving lives. Yet this meaning can become invisible when staff are buried in data entry, compliance paperwork, or bureaucratic processes. Leaders must actively connect daily tasks to population-level impact.

Important: The Meaning Imperative

People who experience a strong sense of meaning at work can overcome many barriers to achieve impact. Conversely, if staff feel their contributions lack purpose, motivation will not exceed what is required to pay bills and care for family.

Make impact direct and observable. Ensure your team understands not just what they do, but why it matters and who benefits.

14.2.1.2 Self-Determination Theory (SDT)

Deci and Ryan’s Self-Determination Theory identifies three innate psychological needs that drive motivation:

Need Description Process Implication
Autonomy Control over one’s work Allow flexibility in how tasks are completed
Competence Mastery and effectiveness Provide training, feedback, and achievable challenges
Relatedness Connection to others Foster team collaboration and shared purpose

SDT complements Frankl’s insights: autonomy, competence, and relatedness are the conditions under which meaningful work flourishes. Without these, even purposeful work becomes draining.

Application to Process Design:

  • Automated systems should augment, not replace, human decision-making
  • Staff should understand why processes exist, not just how to follow them
  • Build in opportunities for skill development as systems evolve

14.2.1.3 Barriers and Remedies

Common human-factor barriers that undermine optimization and efficiency, with practical alternatives:

Barrier Impact Productive Alternative
Invisible impact Work feels meaningless, motivation limited to extrinsic rewards Make outcomes visible; share success stories; connect tasks to beneficiaries; celebrate contributions to mission
Bullying or intimidation Staff silence concerns, errors go unreported, talent leaves Establish psychological safety as a core value; enforce clear anti-bullying policies; provide anonymous reporting channels; train managers to recognize and address harmful behaviors
Poor communication styles Misunderstandings, duplicated effort, delayed decisions Adopt structured formats (SBAR for updates, decision logs for choices); practice active listening; summarize and confirm understanding before acting
Disruptive interpersonal patterns Unpredictable decisions, fear-based culture, high turnover Address observable behaviors rather than speculating on motives; document conduct expectations; use HR processes and governance to protect team function
Leadership lacking domain expertise Misprioritized work, unrealistic timelines, poor resource allocation Pair leaders with subject matter experts; require decision review gates for technical choices; schedule regular domain briefings
Absence of visionary leadership No compelling narrative, team drifts, purpose erodes over time Articulate clear vision; regularly connect work to mission; share impact stories; model commitment
Resistance to change Slow adoption, workarounds, shadow systems Co-design solutions with end users; run small pilots before full rollout; provide training and support; align incentives with adoption

14.2.1.4 Visionary Leadership

Visionary leadership plays a critical role in sustaining meaning and motivation. Leaders who articulate a compelling vision of impact help teams persist through challenges that would otherwise lead to disengagement.

Leadership Behavior Effect on Team
Articulates purpose Connects daily work to mission
Makes impact visible Shares outcomes, success stories, beneficiary feedback
Models commitment Demonstrates personal investment in the work
Celebrates contributions Recognizes how individual efforts advance the mission
Provides context Explains decisions in terms of program goals
Note: CancerSurv Example

The CancerSurv project lead began each sprint review by sharing one story: a cancer cluster identified early, a disparity revealed by the data, or a researcher whose study was enabled by registry completeness. These brief moments connected the team’s technical work to real public health impact, sustaining motivation through difficult implementation challenges.

14.2.1.5 Psychological Safety

Amy Edmondson’s research on psychological safety demonstrates that teams perform better when members feel safe to take risks, ask questions, and admit mistakes without fear of punishment.

Safety Element Process Design Implication
Speaking up Create feedback channels for process improvements
Risk-taking Allow experimentation with PDSA cycles
Mistake tolerance Design systems with error recovery, not just prevention
Help-seeking Document processes so staff can learn independently
Note: CancerSurv Example

When implementing CancerSurv, the state registry created a “no-blame” error reporting system. Registrars could flag data quality issues or workflow problems without fear of criticism. This resulted in:

  • 47% more process improvement suggestions in the first quarter
  • Identification of a critical duplicate detection gap that would have affected NPCR submission
  • Higher staff satisfaction scores (3.2 → 4.1 on 5-point scale)

14.2.1.6 Job Demands-Resources Model

The JD-R model explains that job stress results from imbalance between demands and resources:

Job Demands Job Resources
Workload Autonomy
Time pressure Social support
Complexity Feedback
Ambiguity Growth opportunities

Process optimization should reduce demands while maintaining or increasing resources. Automation that simply increases throughput expectations without adding support leads to burnout.

Tip: Human-Factor Guardrails

Design processes that protect well-being:

  • Set realistic throughput targets aligned to available resources
  • Provide recovery mechanisms when errors occur
  • Rotate high-stress tasks to avoid fatigue concentration
  • Include regular check-ins focused on workload and support needs

14.2.2 Sociological Foundations

Organizations are social systems. Process changes must account for group dynamics, power structures, and cultural norms.

14.2.2.1 Organizational Culture (Schein)

Edgar Schein’s model describes three levels of organizational culture:

  1. Artifacts: Visible structures and processes (what we see)
  2. Espoused Values: Stated strategies and goals (what we say)
  3. Basic Assumptions: Unconscious beliefs (what we actually believe)

Process optimization often fails when it addresses only artifacts while conflicting with basic assumptions. A new data system requiring transparency may fail in an organization where the underlying assumption is “information is power.”

14.2.2.2 Barriers and Remedies

Barrier Impact Productive Alternative
Unhealthy competition Information hoarding, territorial behavior, duplicated effort Establish shared goals that require collaboration; use team-based (not individual) incentives; conduct cross-team reviews to surface dependencies
Political factors Decisions driven by influence rather than evidence; resource allocation disconnected from priorities Document decision criteria transparently; maintain decision logs accessible to stakeholders; require conflict-of-interest declarations for major choices
Misalignment across teams Conflicting priorities, duplicated work, wasted resources Use cascading OKRs to connect team goals to organizational objectives; align work to shared logic models; maintain visible roadmaps showing dependencies
Isolation of opposing views Blind spots in planning, groupthink, low trust from excluded parties Practice structured dissent (assign devil’s advocate roles); use red teaming to stress-test plans; ensure facilitation includes diverse perspectives
Informal leadership undermining formal roles Fragmented decision-making, unclear accountability, team confusion Define RACI for all key processes; formally appoint accountable owners; channel leadership energy into collaborative problem-solving rather than competing authority

14.2.2.3 Communities of Practice

Wenger’s Communities of Practice (CoP) concept describes how learning and knowledge sharing occur through social participation:

CoP Element Application to Process Optimization
Domain Shared competence (e.g., data quality)
Community Members who interact and learn together
Practice Shared resources, tools, and approaches

Establish cross-functional communities to prevent silos and ensure process improvements are adopted across the organization.

14.2.2.4 Deduplication of Effort

One of the most significant inefficiencies in organizations is duplicated work: multiple teams solving the same problem independently. This represents wasted resources and inconsistent outcomes.

Problem Solution Mechanism
Multiple teams cleaning the same data Centralized data team Shared Bronze/Silver data layers
Parallel development of similar tools Internal tool registry Searchable repository of existing solutions
Reinventing processes Process documentation Wiki/knowledge base with templates
Redundant meetings Meeting audit Regular review of recurring meetings
Tip: Centralization Principle

Core functions should be centralized. If multiple groups rely on the same data, that data should be centrally stored, cleaned, and documented. This ensures consistency, reduces duplicate effort, and allows specialized expertise to develop.

Examples of centralizable functions:

  • Data cleaning and standardization
  • Report generation and dissemination
  • Training material development
  • Compliance documentation
Warning: Silo Risks

Deduplication fails when:

  • Goals and definitions differ across teams
  • Shared repositories lack findability or curation
  • Incentives reward local optimization over system impact

Remedies: establish taxonomy and metadata standards, maintain an internal tool registry, and set a review cadence for shared assets.

14.2.3 Project Management

Effective process optimization requires disciplined project management. However, organizations often struggle not just with which framework to use, but with fundamental questions about who should manage projects and what project management actually entails.

14.2.3.1 The Project Manager Role

One of the most significant inefficiencies in public health and research organizations is role confusion between project managers, subject matter experts (SMEs), and leaders. Each role serves a distinct function, but organizations frequently conflate them, leading to wasted expertise and frustrated professionals.

Role Primary Function Core Skills Decision Authority
Project Manager Coordinate execution Planning, scheduling, risk management, stakeholder communication How and when work gets done
Subject Matter Expert Provide technical direction Domain expertise, scientific judgment, methodological rigor What work should be done and why
Leader Set vision and strategy Inspiring others, resource allocation, organizational navigation Where the program is going

Common Anti-Patterns:

Anti-Pattern What Happens Impact
SME as PM Scientists or epidemiologists spend time scheduling meetings, tracking tasks, and chasing deliverables Domain expertise wasted on administration; scientific work suffers; SME frustration and burnout
PM as SME Project managers attempt to make technical decisions or navigate scientific nuances they do not understand Poor technical choices; SMEs must re-explain concepts repeatedly; project delays from misunderstanding requirements
PM as Leader Project managers expected to set strategic direction without authority or organizational perspective Scope confusion; PMs blamed for decisions outside their control; actual leaders abdicate responsibility
Leader without domain grounding Executives set direction without understanding technical constraints or scientific realities Unrealistic expectations; misprioritized work; staff demoralization when told to “just make it happen”
Important: The Expertise Investment Problem

Scientific and technical expertise takes years to develop. When organizations require SMEs to perform project management tasks, they effectively pay expert rates for administrative work that could be done by dedicated coordinators. Simultaneously, the scientific work that only the SME can do goes undone or is rushed.

The math is stark: An epidemiologist spending 20 hours per week on project coordination is not doing 20 hours of epidemiology. That expertise gap cannot be filled by anyone else on the team.

Productive Role Separation:

Function Assigned To Examples
Scheduling and logistics Project Manager Meeting coordination, timeline tracking, resource scheduling
Stakeholder communication Project Manager (with SME input) Status updates, risk escalation, expectation management
Technical decisions SME Methodology selection, data interpretation, scientific quality
Strategic direction Leader (often SME or Sponsor) Program priorities, resource allocation, vision articulation
Process improvement Collaborative PM identifies bottlenecks; SME validates solutions; Leader authorizes changes
Note: CancerSurv Example

The CancerSurv implementation initially struggled because the lead epidemiologist was expected to manage the project while also defining data quality standards, training registrars, and conducting analyses. After three months of missed deadlines and staff frustration, the team restructured:

  • Project Manager: Hired a dedicated PM to handle vendor coordination, sprint planning, and stakeholder reporting
  • Lead Epidemiologist: Focused on data quality specifications, NAACCR standards compliance, and analytic methodology
  • Program Director: Provided strategic direction, secured resources, and resolved organizational barriers

Result: The epidemiologist’s scientific output increased by 60%, sprint velocity improved by 40%, and team satisfaction scores rose from 2.8 to 4.2 (on a 5-point scale).

Tip: Right-Sizing Project Management

Not every project needs a full-time PM. Consider these models:

  • Small projects: SME leads with lightweight PM support (administrative assistant, shared coordinator)
  • Medium projects: Dedicated part-time PM or PM supporting multiple related projects
  • Large/complex projects: Full-time PM with clear authority over coordination; SMEs retained for technical direction
  • Programs: Program manager coordinates across projects; individual PMs or coordinators for each workstream

The key is ensuring someone is responsible for coordination so that SMEs can focus on what only they can do.

14.2.3.2 Project Management Frameworks

Choose frameworks appropriate to your context. The framework matters less than consistent application and clear role definition.

Framework Best For Key Features
Agile/Scrum Evolving requirements, software Iterative, adaptive, team-based
Kanban Continuous flow, operations Visual, WIP limits, pull-based
Waterfall Fixed requirements, compliance Sequential, documented, predictable
Hybrid Public health programs Grant milestones + iterative delivery

14.2.3.3 Project Management Tools

Capability Commercial Options OSS/PH Options
Full PM Suite Jira, Azure DevOps, MS Project OpenProject, Taiga
Kanban Boards Trello (paid), Monday.com Trello (free tier), Wekan
Task Management Asana, ClickUp Nextcloud Tasks, GitHub Issues
Communication Slack, MS Teams Mattermost, Zulip
Important: Tool Selection Criteria

When selecting project management tools:

  1. Visibility: Everyone should know goals, objectives, roles, and responsibilities
  2. Accountability: Tasks should have clear owners and deadlines
  3. Integration: Tools should connect to reduce manual status updates
  4. Accessibility: All team members should be able to access and use the system
  5. Data sovereignty: Consider where project data is stored, especially for sensitive public health programs

14.2.3.4 The Iterative Imperative

Avoid spending months developing systems that do not meet critical needs. Systems should be iteratively developed with priority functionality assessed at each step.

Anti-Pattern Better Approach
12-month development before user feedback 2-week sprints with demos
Complete feature set before release MVP with core functionality first
Comprehensive documentation before coding Just-in-time documentation
Perfect architecture upfront Evolutionary architecture
flowchart LR
    A[Identify Priority<br/>Functionality] --> B[Develop<br/>Increment]
    B --> C[Deploy &<br/>Gather Feedback]
    C --> D{Delivering<br/>Value?}
    D -->|Yes| E[Continue &<br/>Expand]
    D -->|No| F[Pivot or<br/>Discontinue]
    E --> A
    F --> G[Lessons<br/>Learned]
Figure 14.2: Iterative Development with Value Assessment

14.2.3.5 Barriers and Remedies

Barrier Impact Productive Alternative
Poor leadership Scope drift, low morale, unclear direction Establish clear project charter with defined authority; provide leadership coaching; implement governance oversight with regular check-ins
No clear goals Scattered effort, difficulty measuring progress, weak demonstrated value Define measurable outcomes upfront; adopt OKRs connecting daily work to strategic objectives; align deliverables to logic model outcomes
Disorganization Missed deadlines, rework, lost information Implement lightweight rituals (standups, retrospectives); use visual management (Kanban boards); establish regular planning and review cadence
Misalignment across workstreams Conflicting priorities, duplicated effort, integration failures Conduct cross-functional prioritization sessions; maintain a single prioritized backlog; map and communicate dependencies explicitly
Overemphasis on aesthetics Delayed value delivery, resources diverted from core function Prioritize scalability and reliability first; defer visual polish until core functionality proven; measure success by outcomes, not appearance
Unrealistic expectations Burnout, quality shortcuts, failed delivery, eroded trust Use evidence-based estimation (historical velocity, throughput data); align commitments to actual capacity; set SLAs based on demonstrated capability
Shifting expectations (scope creep) Constant rework, team churn, inability to complete anything Implement formal change control process; refine backlog regularly with stakeholders; document scope decisions; re-baseline schedules with explicit stakeholder sign-off when scope changes

Note: Role confusion between PMs, SMEs, and leaders is addressed in detail in The Project Manager Role section above.

14.2.4 Communication and Transparency

Clear communication is the foundation of organizational efficiency. Teams cannot avoid duplicating work if they do not know what others are doing.

14.2.4.1 Communication Hierarchy

Level Content Frequency Tool
Strategic Goals, priorities, resource allocation Quarterly All-hands, leadership memo
Tactical Project status, blockers, decisions Weekly Team meetings, PM tool
Operational Task updates, questions, collaboration Daily Chat, task comments
Ad-hoc Urgent issues, clarifications As needed Direct message, huddle

14.2.4.2 RACI Matrix for Process Clarity

Define roles clearly to prevent gaps and overlaps:

Role Definition
Responsible Does the work
Accountable Ultimately answerable (one person only)
Consulted Provides input
Informed Kept updated
Note: CancerSurv Example

RACI for Data Quality Process:

Activity Data Analyst Data Manager Epidemiologist IT Support
Receive hospital files I R I C
Validate file format C R I A
Clean demographic data R C I I
Apply edits/business rules R C A I
Generate quality report R I A I
Resolve data issues R C A C
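A RACI matrix is easy to validate once it is written down as data. The sketch below, with role codes drawn from the table above (the specific activities and assignments are abbreviated for illustration), checks the rule that every activity needs exactly one Accountable and at least one Responsible.

```python
# RACI matrix as a dictionary: activity -> {role: code}.
# Activities and assignments are an illustrative subset, not the full table.
raci = {
    "Receive hospital files": {"Data Analyst": "I", "Data Manager": "R",
                               "Epidemiologist": "I", "IT Support": "C"},
    "Apply edits/business rules": {"Data Analyst": "R", "Data Manager": "C",
                                   "Epidemiologist": "A", "IT Support": "I"},
}

def raci_gaps(matrix):
    """Flag activities lacking exactly one Accountable or any Responsible."""
    problems = []
    for activity, roles in matrix.items():
        codes = list(roles.values())
        if codes.count("A") != 1:
            problems.append(f"{activity}: needs exactly one Accountable")
        if "R" not in codes:
            problems.append(f"{activity}: needs a Responsible")
    return problems

print(raci_gaps(raci))
```

Run against a full matrix, a check like this surfaces rows where accountability was never assigned, before the gap shows up as a stalled process.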

14.2.4.3 Poor Styles and Productive Alternatives

Poor Style Typical Outcome Productive Alternative
Vague updates (“Things are going fine”) Confusion about actual status, surprises, rework Use SBAR format: Situation (what’s happening), Background (context), Assessment (analysis), Recommendation (proposed action)
One-way broadcasting (announcements without dialogue) Low engagement, unaddressed concerns, passive resistance Practice active listening; explicitly solicit questions; summarize decisions and confirm understanding
Unstructured meetings (no agenda, no outcomes) Wasted time, repeated discussions, unclear decisions Publish agenda with specific objectives; timebox each topic; end with documented decisions and assigned next steps
Email-only coordination for complex work Slow responses, fragmented context, lost threads Use shared PM tools for task tracking; maintain threaded discussions tied to work items; keep decision logs for reference
Public criticism of individuals Fear, defensiveness, hidden problems Provide feedback privately; focus on specific behaviors rather than character; reinforce expected norms constructively

14.2.5 Educational Foundations

Process optimization requires ongoing learning and skill development.

14.2.5.1 Adult Learning Principles (Andragogy)

Malcolm Knowles identified key principles for adult learners:

Principle Application
Self-directed Provide resources for independent learning
Experience-based Connect new processes to existing knowledge
Relevance-oriented Explain why processes matter
Problem-centered Frame training around real challenges
Internally motivated Appeal to professional growth, not just compliance

14.2.5.2 Building Technical Capacity

Desktop data management tasks should be scripted. Manual data cleaning in spreadsheets is error-prone, non-reproducible, and does not scale.

Manual Approach Scripted Approach
Copy-paste in Excel R/Python script
Point-and-click transformations Documented code
“I remember how I did it” Version-controlled workflow
One person can do it Anyone can run it
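The contrast in the table above can be made concrete. The following sketch replaces manual spreadsheet cleaning with a short script; the column names and cleaning rules are invented for illustration, but the pattern (explicit, documented rules instead of silent edits) is the point.

```python
import csv
import io

# Illustrative raw extract with the kinds of problems usually fixed by hand:
# stray whitespace, missing values, non-numeric ages.
RAW = """patient_id,county,age
001,Adams ,34
002,,41
003,Brown,NA
"""

def clean_rows(text):
    """Trim whitespace, drop rows missing a county, coerce age to int or None."""
    rows = []
    for row in csv.DictReader(io.StringIO(text)):
        county = row["county"].strip()
        if not county:
            continue   # a documented rule, not a silent delete in Excel
        age = row["age"].strip()
        rows.append({"patient_id": row["patient_id"],
                     "county": county,
                     "age": int(age) if age.isdigit() else None})
    return rows

print(clean_rows(RAW))
```

Because every rule is a line of code, the script is its own audit trail: anyone can read exactly what was dropped or transformed, and anyone can rerun it on next month's file.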
Tip: Learning Path Recommendation

Encourage all analysts to learn basic programming:

  1. Start with R or Python: Both are free, well-documented, and widely used
  2. Focus on data manipulation first: tidyverse (R) or pandas (Python)
  3. Learn version control: Git fundamentals for collaboration
  4. Build incrementally: Automate one task at a time
  5. Share and document: Create institutional knowledge

Recommended resources:

14.2.5.3 Barriers and Remedies

Barrier Impact Productive Alternative
Leadership lacks domain expertise Misguided priorities, unrealistic technical decisions Pair leaders with SMEs for technical decisions; schedule regular domain briefings; require review gates before committing to technical approaches
Training deprioritized Persistent skill gaps, reliance on few experts, fragility Invest in microlearning (short, focused modules); build practice into regular work; establish mentorship programs pairing experienced and developing staff
Tool resistance (“I’ve always done it this way”) Manual work persists despite better options, inconsistent outputs Provide low-stakes practice environments; pair resistant staff with supportive peers; share concrete success stories showing time saved
Knowledge silos Rework when experts unavailable, inconsistent methods across team Develop shared curricula documenting standard approaches; host brown-bag sessions for knowledge sharing; consider internal certifications to validate and spread expertise

14.3 Automation Strategy

14.3.1 The Automation Spectrum

Not all automation is equal. Consider the level appropriate for each process:

Level Description Example Human Role
Manual Human does all work Ad-hoc data requests Full control
Assisted Tools support human work Spell-check, templates Decision-maker
Partial System handles routine; human handles exceptions Auto-coding with review queue Exception handler
Conditional System does most; human monitors Scheduled reports with alerts Supervisor
Full System operates independently Automated backups Oversight only

14.3.2 Automation with Accountability

Automated systems sometimes miss critical requirements, because requirements continue to evolve after deployment. Automation must therefore be flexible enough to adjust to changing demands, and accountable enough that its failures are detected and corrected.

Requirement Implementation
Flexibility Configurable rules, not hard-coded logic
Auditability Logging of all automated decisions
Override capability Human can intervene when needed
Feedback loops Mechanism to report automation failures
Version control Track changes to automation rules
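The first two requirements, configurable rules and auditability, can be sketched together. In this hypothetical example the rules live in data (here a JSON string; in practice a versioned config file), so they can change without a redeploy, and every fired rule is logged.

```python
import json
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("edits")

# Rules as data, not hard-coded logic. Field names, operators, and
# thresholds are illustrative assumptions.
RULES = json.loads("""[
    {"field": "age", "op": "max", "value": 120, "action": "reject"},
    {"field": "county", "op": "required", "value": null, "action": "flag"}
]""")

def apply_rules(record, rules=RULES):
    """Return (record, issues); every automated decision is logged for audit."""
    issues = []
    for r in rules:
        failed = ((r["op"] == "max" and record.get(r["field"], 0) > r["value"]) or
                  (r["op"] == "required" and not record.get(r["field"])))
        if failed:
            issues.append((r["field"], r["action"]))
            log.info("rule fired: %s -> %s on %r", r["field"], r["action"], record)
    return record, issues

_, issues = apply_rules({"age": 130, "county": ""})
print(issues)   # [('age', 'reject'), ('county', 'flag')]
```

Override capability follows naturally: because the function returns issues rather than silently discarding records, a human reviewer can accept a flagged record that the rules got wrong.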

14.3.2.1 The AI Accountability Challenge

Artificial intelligence presents unique challenges for process automation:

AI Challenge Mitigation Strategy
Lack of accountability Assign human owner for AI-assisted decisions
Inconsistent reliability Implement confidence thresholds and fallback processes
Volume of output Train more reviewers; focus review on high-risk items
Opacity (“black box”) Require explainable AI or human decision for critical paths
Drift over time Regular performance monitoring and recalibration
Warning: AI Oversight Principle

There should always be someone who reviews critical operations. AI can accelerate work, but lacks the accountability and judgment required for consequential decisions.

The challenge: humans cannot review the vast amounts of AI-generated output. Solutions include:

  • Risk-based review: Focus human attention on high-stakes decisions
  • Sampling: Statistically valid review of AI output subsets
  • Building reviewer capacity: Train more staff to critically evaluate AI outputs
  • Automated validation: Use rule-based systems to catch obvious AI errors
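Risk-based review and sampling can be combined in a single routing rule. This is a minimal sketch under stated assumptions: the confidence threshold (0.90) and the 5% audit sample rate are placeholders a program would calibrate from its own error data.

```python
import random

def route_for_review(item, conf_threshold=0.90, sample_rate=0.05,
                     rng=random.random):
    """Route AI output: high-risk or low-confidence items go to humans;
    the rest are sampled at a fixed rate for quality audit."""
    if item["risk"] == "high" or item["confidence"] < conf_threshold:
        return "human review"
    if rng() < sample_rate:
        return "audit sample"
    return "auto-accept"

print(route_for_review({"risk": "high", "confidence": 0.99}))   # human review
print(route_for_review({"risk": "low", "confidence": 0.50}))    # human review
```

The injectable `rng` parameter is there so the sampling branch itself can be tested deterministically, which is exactly the kind of reviewability the oversight principle asks for.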

14.3.2.2 Barriers and Remedies

Barrier Impact Productive Alternative
Resistance to tool adoption Parallel manual processes undermine efficiency gains, inconsistent outputs Co-design automation with end users from the start; pilot features with willing early adopters; identify champions and super-users to support peers
Rigid automation (hard-coded logic) System cannot adapt to legitimate exceptions, workarounds proliferate Build configurable rules rather than fixed logic; ensure human override capability; plan for rapid iteration as requirements evolve
Opaque AI decisions Low trust, reluctance to rely on system, manual double-checking negates efficiency Require explainability for consequential decisions; implement confidence thresholds that trigger human review; maintain fallback to human judgment for edge cases
Over-automation (removing humans entirely) Brittleness when exceptions occur, catastrophic failures without intervention Keep humans in the loop for exception handling; monitor automated systems actively; recalibrate regularly based on performance data

14.3.3 Scripting Desktop Tasks

Transform manual data management into reproducible workflows:

flowchart LR
    subgraph Manual["Manual Process"]
        M1[Open Excel] --> M2[Copy data]
        M2 --> M3[Clean manually]
        M3 --> M4[Format output]
        M4 --> M5[Email results]
    end
    
    subgraph Scripted["Scripted Process"]
        S1[Run script] --> S2[Auto-clean]
        S2 --> S3[Generate report]
        S3 --> S4[Archive & notify]
    end
    
    Manual -->|Transform| Scripted
Figure 14.3: From Manual to Scripted Data Workflow

Benefits of scripted workflows:

Benefit Description
Reproducibility Same code produces same results
Auditability Code documents exactly what was done
Scalability Process 10 or 10,000 records identically
Error reduction Eliminates copy-paste mistakes
Knowledge transfer New staff can run existing scripts
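The scripted side of Figure 14.3 can be sketched as a skeleton pipeline. The three steps are deliberately trivial stand-ins; real cleaning, reporting, and notification logic would replace them, but the shape (small functions chained by one entry point, with the archived report as an audit trail) is the reusable part.

```python
import datetime
import pathlib
import tempfile

def auto_clean(raw_lines):
    """Stand-in cleaning step: trim whitespace, drop blank lines."""
    return [line.strip() for line in raw_lines if line.strip()]

def generate_report(rows):
    """Stand-in reporting step: summarize what was processed and when."""
    return f"{len(rows)} records processed on {datetime.date.today()}"

def archive(report, out_dir):
    """Write the report to disk; the archived copy doubles as the audit trail."""
    path = pathlib.Path(out_dir) / "report.txt"
    path.write_text(report)
    return path

# One entry point runs the whole flow: anyone can execute it, not just
# the person who remembers the manual steps.
with tempfile.TemporaryDirectory() as d:
    rows = auto_clean(["  a  ", "", "b"])
    path = archive(generate_report(rows), d)
    print(path.read_text())
```

Each benefit in the table maps onto a property of this structure: reruns are identical (reproducibility), the code states what happened (auditability), and the input list could be 10 or 10,000 lines (scalability).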

14.4 Informatics Systems: Enabler or Barrier

Informatics systems are the backbone of modern public health operations. When well-designed and implemented, they accelerate every aspect of program delivery. When poorly conceived or executed, they become obstacles that consume resources, frustrate staff, and ultimately harm the populations they were meant to serve.

14.4.1 The Dual Nature of Informatics

The same system can be an enabler or barrier depending on design choices, implementation quality, and organizational context:

Dimension Informatics as Enabler Informatics as Barrier
Data access Self-service queries empower analysts Locked-down systems require IT tickets for every request
Workflow integration Systems fit existing work patterns Staff must adapt to rigid, unintuitive interfaces
Interoperability Standards-based data exchange (HL7, FHIR) Proprietary formats create data silos
Scalability Architecture handles growth and surges Systems buckle under increased load
Adaptability Configurable rules accommodate change Hard-coded logic requires vendor involvement
Usability Intuitive design reduces training burden Complex interfaces increase error rates
Documentation Clear specifications enable troubleshooting Undocumented systems create key-person dependencies
Maintenance Modular design allows incremental updates Monolithic systems require risky big-bang changes

14.4.2 Design Principles for Enabling Systems

Informatics systems that enable efficiency share common characteristics:

14.4.2.1 User-Centered Design

Principle Implementation
Involve end users early Conduct contextual inquiry; observe actual workflows before designing
Prototype iteratively Show working software, not just specifications; gather feedback continuously
Minimize cognitive load Reduce clicks, eliminate redundant data entry, provide smart defaults
Design for errors Assume mistakes happen; make recovery easy; prevent catastrophic actions
Support diverse users Accommodate varying technical skill levels; provide progressive complexity

14.4.2.2 Technical Architecture

Principle Implementation
Modular design Separate concerns; allow components to be updated independently
API-first approach Expose functionality through well-documented interfaces
Standards compliance Adopt industry standards (HL7 FHIR, USCDI) for interoperability
Scalable infrastructure Design for 10x expected load; plan for surge capacity
Security by design Build authentication, authorization, and audit into the foundation
Note: CancerSurv Example

The CancerSurv system was designed with a clear separation between:

  • Data layer: Centralized data warehouse with Bronze/Silver/Gold architecture
  • Business logic layer: Configurable rules engine for edits and validations
  • Presentation layer: Role-based interfaces for registrars, epidemiologists, and administrators
  • Integration layer: HL7 FHIR APIs for hospital and lab connectivity

This modular approach allowed the team to update the duplicate detection algorithm without touching case abstraction screens, and to add new hospital integrations without modifying the core data model.
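A configurable rules engine of the kind described in the business logic layer might look like the following sketch. The rules and field names here are hypothetical, not CancerSurv's actual edits; the design point is that rules are data rather than hard-coded branches, so they can be changed without touching screens or the data model.

```python
# Illustrative validation rules engine; rule definitions are assumptions.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    check: Callable[[dict], bool]   # returns True when the record passes
    message: str

# In a real system these could be loaded from configuration; inline for brevity.
RULES = [
    Rule("age_range", lambda r: 0 <= r.get("age", -1) <= 120, "age out of range"),
    Rule("site_present", lambda r: bool(r.get("site_code")), "missing site code"),
]

def validate(record: dict, rules: list[Rule]) -> list[str]:
    """Run every configured rule; return messages for any failures."""
    return [rule.message for rule in rules if not rule.check(record)]

print(validate({"age": 200, "site_code": "C50.9"}, RULES))  # ['age out of range']
```

Swapping in a new edit is then a configuration change (adding a `Rule` to the list) rather than vendor-mediated code surgery, which is what makes the layered architecture adaptable.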

14.4.3 Common Informatics Barriers

Learn to recognize and address the following patterns, which can turn systems into obstacles:

14.4.3.1 System-Level Barriers

Barrier Impact Remedy
Legacy system lock-in Cannot adopt modern approaches; vendor dependency Plan migration paths; negotiate data portability; avoid proprietary formats
Technical debt accumulation Increasing fragility; slower development; higher maintenance costs Allocate time for refactoring; track and prioritize debt reduction
Integration failures Data silos persist; manual reconciliation required Invest in middleware; adopt standards; establish data governance
Performance degradation Staff workarounds; delayed operations; missed deadlines Monitor proactively; plan capacity; optimize before crisis
Security vulnerabilities Data breaches; compliance failures; loss of public trust Regular assessments; patch management; security-aware development

14.4.3.2 Organizational Barriers

Barrier Impact Remedy
Misaligned procurement Systems selected for features not used; actual needs unmet Involve end users in selection; weight usability heavily; pilot before commitment
Insufficient training Powerful features unused; errors from misunderstanding Budget for training; provide ongoing support; create user communities
Change resistance New systems underutilized; parallel manual processes persist Co-design with users; demonstrate value; provide transition support
Vendor over-dependence Slow response to needs; high costs for changes; strategic constraints Maintain internal expertise; document configurations; plan exit strategies
Governance gaps No clear ownership; conflicting priorities; stalled decisions Assign system owners; establish steering committees; define decision rights

14.4.4 Informatics Maturity Model

Assess your organization’s informatics capability to identify improvement priorities:

Level Characteristics Indicators
1. Ad-hoc No formal systems; spreadsheet-based; individual solutions Data inconsistency; key-person dependencies; no audit trail
2. Defined Standard tools selected; basic documentation; some training Consistent formats; documented procedures; identifiable system owners
3. Managed Integrated systems; performance monitoring; governance structures Automated workflows; SLAs tracked; regular reviews
4. Optimized Continuous improvement; advanced analytics; predictive capabilities Data-driven decisions; proactive issue detection; measurable outcomes
5. Transformative Systems enable new capabilities; strategic asset; innovation driver New programs enabled; cross-program insights; national leadership
Tip: Maturity Assessment

Most public health programs operate between levels 2 and 3. Advancing maturity requires sustained investment in:

  • People: Training, hiring, retention of informatics talent
  • Process: Governance, standards, continuous improvement practices
  • Technology: Modern architecture, interoperability, maintainability

Focus on foundational capabilities before pursuing advanced features. A well-implemented level 3 system delivers more value than a poorly implemented level 4 system.

14.4.5 Barriers and Remedies Summary

Barrier Impact Productive Alternative
Poorly designed systems User frustration, workarounds, data quality issues, wasted effort Apply user-centered design; prototype iteratively; conduct usability testing; involve end users throughout
Legacy system constraints Inability to modernize, vendor lock-in, accumulating technical debt Plan migration incrementally; negotiate data portability; maintain internal expertise; document thoroughly
Interoperability failures Data silos, manual reconciliation, incomplete picture, duplicated effort Adopt standards (HL7 FHIR, USCDI); invest in integration infrastructure; establish data governance
Insufficient informatics capacity Over-reliance on vendors, slow response to needs, inability to optimize Build internal team; provide career paths; partner with academic programs; cross-train existing staff
Misaligned system selection Features unused, actual needs unmet, costly customization Involve users in procurement; weight usability; pilot before committing; define requirements clearly

14.5 Centralization and Shared Services

14.5.1 The Medallion Architecture for Shared Data

When multiple teams rely on the same data, implement a centralized data architecture:

flowchart LR
    subgraph Sources["Data Sources"]
        H[Hospitals]
        L[Labs]
        V[Vital Records]
    end
    
    subgraph Central["Centralized Data Team"]
        B[(Bronze<br/>Raw Data)]
        S[(Silver<br/>Cleaned Data)]
        G[(Gold<br/>Analysis-Ready)]
    end
    
    subgraph Consumers["Data Consumers"]
        E[Epidemiologists]
        R[Registrars]
        A[Analysts]
        P[Program Staff]
    end
    
    H --> B
    L --> B
    V --> B
    B --> S
    S --> G
    G --> E
    G --> R
    G --> A
    G --> P
Figure 14.4: Centralized Data Architecture

Centralization benefits:

  • Consistency: All consumers use the same cleaned data
  • Expertise: Data team develops specialized cleaning skills
  • Efficiency: Clean once, use many times
  • Quality: Single point of accountability for data quality
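The Bronze/Silver/Gold flow can be sketched in a few lines of plain Python. The fields and transformations below are illustrative assumptions, not a real pipeline; the structural point is that each layer is derived from the previous one, so every consumer reads the same Gold table instead of cleaning raw feeds independently.

```python
# Hypothetical medallion pipeline for illustration.
bronze = [  # raw submissions, exactly as received (duplicates and all)
    {"case_id": "A1", "county": " adams", "year": "2023"},
    {"case_id": "A1", "county": "Adams", "year": "2023"},   # duplicate
    {"case_id": "B2", "county": "Boone", "year": "2023"},
]

def to_silver(rows: list[dict]) -> list[dict]:
    """Clean and deduplicate: one standardized row per case_id."""
    seen = {}
    for row in rows:
        seen[row["case_id"]] = {
            "case_id": row["case_id"],
            "county": row["county"].strip().title(),
            "year": int(row["year"]),
        }
    return list(seen.values())

def to_gold(rows: list[dict]) -> dict:
    """Aggregate to analysis-ready counts by county and year."""
    counts = {}
    for row in rows:
        key = (row["county"], row["year"])
        counts[key] = counts.get(key, 0) + 1
    return counts

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {('Adams', 2023): 1, ('Boone', 2023): 1}
```

"Clean once, use many times" is visible here: the deduplication logic lives in exactly one place, and epidemiologists, registrars, and analysts all consume the same `gold` output.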

14.5.2 Shared Services Model

Extend centralization beyond data to other core functions:

Function Centralized Model Benefits
Data cleaning Central data team maintains Silver layer Consistent quality, reduced duplication
Report generation Shared reporting infrastructure Standard formats, automated distribution
Training development Central learning team Consistent messaging, professional quality
Compliance documentation Compliance office coordinates Complete coverage, expert interpretation
Tool administration IT manages shared tools Security, licensing, support

14.6 Establishing Accountability Structures

14.6.1 Clear Governance

Establish clear structures that assign responsibility and accountability. Without governance, process improvements drift and deteriorate.

Governance Element Purpose Example
Process owner Single point of accountability Data Quality Manager
Steering committee Strategic decisions, resource allocation Monthly leadership review
Working groups Operational improvements Data Quality Working Group
SLAs/OLAs Documented expectations 99.9% uptime, 48-hour response
Escalation paths Clear routes for unresolved issues Analyst → Manager → Director

14.6.2 Metrics and Monitoring

What gets measured gets managed. Establish metrics for process performance:

Metric Category Examples
Efficiency Time per task, throughput, backlog size
Quality Error rates, rework rates, audit findings
Adoption Usage rates, training completion, feedback scores
Value Cost savings, time saved, outcomes improved
Note: CancerSurv Example

Process Optimization Dashboard:

Metric Baseline Target Current Status
Case abstraction time 15 min 8 min 9.2 min 🟡
Data completeness 89% 95% 94.3% 🟡
Duplicate detection rate 73% 95% 96.1% 🟢
Hospital submission lag 14 days 7 days 5.2 days 🟢
Manual interventions/week 127 50 43 🟢
Staff satisfaction 3.2/5 4.0/5 4.1/5 🟢

The dashboard is reviewed weekly by the Data Quality Working Group and monthly by the program steering committee.
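Status flags like those in the dashboard can be derived automatically rather than assigned by hand. The thresholds below are illustrative assumptions (green when the target is met, yellow when at least half the gap from baseline to target is closed), not the program's actual rules; note that the logic must account for metrics where lower is better, such as abstraction time.

```python
# Hypothetical traffic-light logic; thresholds are assumptions.
def status(baseline: float, target: float, current: float) -> str:
    """Return 'green', 'yellow', or 'red' for one dashboard metric."""
    lower_is_better = target < baseline
    # Green: target met, in whichever direction the metric improves
    if (current <= target) if lower_is_better else (current >= target):
        return "green"
    # Fraction of the baseline-to-target gap closed so far
    if lower_is_better:
        gap_closed = (baseline - current) / (baseline - target)
    else:
        gap_closed = (current - baseline) / (target - baseline)
    return "yellow" if gap_closed >= 0.5 else "red"

# Rows mirror the example dashboard: (baseline, target, current)
print(status(15, 8, 9.2))    # case abstraction time -> yellow
print(status(73, 95, 96.1))  # duplicate detection rate -> green
```

Computing flags from data keeps the dashboard honest: a metric cannot quietly stay green after its numbers slip, and the thresholds themselves become a documented, reviewable governance artifact.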

14.6.2.1 Barriers and Remedies

Barrier Impact Productive Alternative
Political interference Decision volatility, resources redirected without justification, staff demoralization Establish charter-based governance with documented authority; use transparent criteria for decisions; engage external audit for high-stakes or contentious choices
Ambiguous accountability Tasks dropped between roles, finger-pointing when problems arise Assign explicit process owners with documented responsibility; define SLAs/OLAs for handoffs; establish clear escalation paths when issues arise
Goal drift Original value proposition lost, effort disconnected from outcomes Conduct quarterly goal reviews against original objectives; maintain traceability from daily work to logic model outcomes and grant requirements
Fragmented oversight Conflicting directives from multiple authorities, staff confusion Consolidate to single steering committee with clear authority; integrate calendars to prevent conflicting demands; establish unified reporting cadence

14.7 Bringing It Together: A Comprehensive Framework

14.7.1 The Process Optimization Lifecycle

flowchart TD
    A[Identify Process] --> B[Assess Value]
    B --> C{Mission<br/>Critical?}
    C -->|No| D[Eliminate or<br/>Reduce]
    C -->|Yes| E[Map Current State]
    E --> F[Identify Inefficiencies]
    F --> G[Design Improvements]
    G --> H[Pilot & Test]
    H --> I{Effective?}
    I -->|No| G
    I -->|Yes| J[Implement]
    J --> K[Monitor & Measure]
    K --> L{Meeting<br/>Targets?}
    L -->|No| F
    L -->|Yes| M[Standardize & Document]
    M --> N[Continuous Monitoring]
    N --> F
Figure 14.5: Process Optimization Lifecycle

14.7.2 Integration Checklist

When optimizing processes, ensure you address all dimensions:

Dimension Questions to Ask
Psychology Does this support autonomy, competence, and relatedness? Is the environment psychologically safe?
Sociology Does this align with organizational culture? Are communities of practice engaged?
Project Management Is there a clear plan with milestones? Are tools appropriate?
Communication Do all stakeholders know the goals, roles, and status?
Education Do staff have the skills needed? Is training available?
Centralization Are core functions appropriately centralized? Is duplication minimized?
Automation Is the right level of automation applied? Is there human oversight?
Accountability Are governance structures clear? Are metrics tracked?

14.7.3 The Public Health Value Proposition

Ultimately, process optimization in public health must demonstrate value to the taxpayer. Every efficiency gain should connect to improved health outcomes:

Efficiency Gain Taxpayer Value
Faster data processing Earlier outbreak detection
Reduced manual errors More accurate surveillance
Automated reporting More resources for intervention
Streamlined workflows Lower cost per case processed
Better data quality More reliable public health decisions
Important: The Accountability Test

For every process, ask: “Could I explain to a taxpayer why this is necessary and how it contributes to public health?”

If the answer is no, the process should be eliminated, automated, or fundamentally redesigned.

14.8 Visual Framework: Factors Influencing Program Efficiency

The following directed acyclic graph (DAG) synthesizes the key enablers and barriers discussed throughout this chapter, illustrating how foundational factors flow through organizational processes to ultimately affect public health program impact.

flowchart TD
    %% LAYER 1: Foundations
    PSY[Psychological<br/>Foundations]
    SOC[Sociological<br/>Foundations]
    EDU[Educational<br/>Foundations]

    %% LAYER 2: Enablers and Barriers side by side
    subgraph Layer2[" "]
        direction LR
        subgraph Enablers[Enabling Factors]
            E1[Governance]
            E2[Communication]
            E3[Iteration]
            E4[Centralization]
            E5[Automation]
            E6[Visionary Leadership]
            E7[Visible Impact]
            E8[Enabling Informatics]
        end
        subgraph Barriers[Barriers]
            B1[Intimidation]
            B2[Politics]
            B3[Silos]
            B4[Duplication]
            B5[Rigidity]
            B6[Ambiguity]
            B7[Scope Creep]
            B8[Invisible Impact]
            B9[Poor Informatics]
        end
    end

    %% LAYER 3: Processes
    subgraph Processes[Organizational Processes]
        direction LR
        P1[Project Mgmt]
        P2[Change Mgmt]
        P3[Data Mgmt]
        P4[Quality Improvement]
    end

    %% LAYER 4: Outcomes
    subgraph Outcomes[Intermediate Outcomes]
        direction LR
        O1[Efficiency]
        O2[Data Quality]
        O3[Well-being]
        O4[Adoption]
        O5[Meaning]
    end

    %% LAYER 5: Impact
    IMPACT([Program Impact &<br/>Taxpayer Value])

    %% Foundations to Enablers
    PSY --> E1 & E2 & E6 & E7
    SOC --> E2 & E4
    EDU --> E3 & E5 & E8

    %% Foundations to Barriers (when weak)
    PSY -.-> B1 & B8
    SOC -.-> B2 & B3
    EDU -.-> B4 & B9

    %% Enablers to Processes
    E1 --> P1 & P4
    E2 --> P1 & P2
    E3 --> P1
    E4 --> P3
    E5 --> P3 & P4
    E6 --> P2
    E7 --> P2 & P4
    E8 --> P1 & P3 & P4

    %% Barriers block Processes
    B1 -.-> P2
    B2 -.-> P1
    B3 -.-> P3
    B4 -.-> P1
    B5 -.-> P3
    B6 -.-> P1
    B7 -.-> P1
    B8 -.-> P2 & P4
    B9 -.-> P1 & P3 & P4

    %% Processes to Outcomes
    P1 --> O1
    P2 --> O3 & O4 & O5
    P3 --> O2
    P4 --> O1 & O2

    %% Outcomes to Impact
    O1 & O2 & O3 & O4 & O5 --> IMPACT

    %% Styling
    classDef foundation fill:#e1f5fe,stroke:#01579b,stroke-width:2px
    classDef enabler fill:#e8f5e9,stroke:#2e7d32,stroke-width:2px
    classDef barrier fill:#ffebee,stroke:#c62828,stroke-width:2px
    classDef process fill:#fff3e0,stroke:#ef6c00,stroke-width:2px
    classDef outcome fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px
    classDef impact fill:#1565c0,stroke:#0d47a1,stroke-width:3px,color:#fff
    classDef invisible fill:none,stroke:none

    class PSY,SOC,EDU foundation
    class E1,E2,E3,E4,E5,E6,E7,E8 enabler
    class B1,B2,B3,B4,B5,B6,B7,B8,B9 barrier
    class P1,P2,P3,P4 process
    class O1,O2,O3,O4,O5 outcome
    class IMPACT impact
    class Layer2 invisible
Figure 14.6: DAG: Supporting Factors and Barriers to Public Health Program Optimization

14.8.1 Reading the DAG

The diagram illustrates causal pathways between factors affecting public health program optimization:

Layer Color Components
Foundations Blue Psychological (meaning, SDT, safety, JD-R, leadership), Sociological (culture, CoPs, power), Educational (andragogy, capacity, knowledge)
Enablers Green Governance, Communication, Iteration, Centralization, Automation, Visionary Leadership, Visible Impact, Enabling Informatics
Barriers Red Intimidation, Politics, Silos, Duplication, Rigidity, Ambiguity, Scope Creep, Invisible Impact, Poor Informatics
Processes Orange Project Management, Change Management, Data Management, Quality Improvement
Outcomes Purple Efficiency, Data Quality, Well-being, Adoption, Meaning
Impact Dark Blue Program Impact & Taxpayer Value

Arrow types:

  • Solid arrows (→) represent positive causal pathways where factors support or enable downstream elements
  • Dashed arrows (⇢) represent negative pathways where foundational weaknesses create barriers that undermine processes
Tip: Using This Framework

This DAG can serve as a diagnostic tool when assessing program optimization opportunities:

  1. Assess foundations: Are psychological safety, healthy culture, and technical capacity in place?
  2. Inventory enablers: Which enabling factors are present? Which are missing?
  3. Identify barriers: Which barriers are actively impeding progress?
  4. Trace pathways: Follow the arrows to understand how barriers affect outcomes
  5. Prioritize interventions: Address foundational gaps and high-impact barriers first

14.9 Summary

Process optimization requires a multidisciplinary approach that addresses both technical systems and human factors:

  1. Understand human factors: Motivation, psychological safety, and well-being are prerequisites for sustainable improvement. Processes that ignore people will fail.

  2. Address organizational dynamics: Culture, power structures, and communication patterns can enable or obstruct change. Work with these forces, not against them.

  3. Apply disciplined management: Iterative development, clear accountability, and appropriate tools prevent wasted effort. Set realistic expectations and manage scope actively.

  4. Automate thoughtfully: Prioritize scalability and flexibility over perfection. Maintain human oversight, especially for AI-assisted processes that lack inherent accountability.

  5. Centralize core functions: Reduce duplication by sharing data, tools, and expertise. Build specialized capability where it matters most.

  6. Demonstrate value: Connect every optimization to measurable outcomes. In public health, that means taxpayer value and improved health for the populations we serve.

The goal is not efficiency for its own sake, but rather maximizing public health impact with the resources entrusted to us.