JOB SUMMARY
The Central Health System's Data Integration Engineer designs, develops, implements, enhances, and supports enterprise integrations, APIs, and cloud data pipelines across on-premises and cloud environments. Primary platforms include Boomi Integration (iPaaS) and Boomi Data Pipelines for application and data workflows; Snowflake for cloud data warehousing; and Microsoft Azure services for orchestration, compute, and API management. The role enables interoperability using standards such as RESTful APIs, HL7 v2.x messaging, X12 EDI claims transactions (e.g., 270/271, 276/277, 834, 835, 837, 999/TA1), and FHIR APIs. The engineer ensures secure, reliable, and compliant data exchange among internal systems (EHR, revenue cycle, operations) and external partners (payers, state agencies, and vendors).
This role is central to the organization's modern data platform strategy, including event-driven integration patterns, medallion/lakehouse data architecture in Snowflake and Azure Data Lake Storage Gen2, AI/ML data pipeline readiness, and compliance with CMS interoperability mandates. The engineer partners closely with architects, data scientists, clinical informatics teams, and external vendors to deliver trusted, high-quality data products across the system.
ESSENTIAL FUNCTIONS:
-Boomi Integration Design & Development: Design, build, and maintain enterprise integrations using Boomi Integration (iPaaS) — including process workflows, connector configuration (HTTP/S, SQL, SFTP, HL7, EDI), decision logic, and robust error handling and retry patterns.
-Epic EHR & Clinical System Integrations: Develop and support Epic-sourced integrations with diagnostic machines, medical devices, lab systems, and other clinical applications — including ADT, ORU, SIU, ORM, VXU, and DFT message flows. Serve as the primary technical resource for Epic interface build and troubleshooting.
-Vendor & Partner Data Integrations: Build and maintain integrations with external vendors, payers, state agencies, and third-party data sources — including inbound/outbound file-based (SFTP, flat file, CSV, JSON, XML) and API-based exchange patterns. Coordinate with vendor technical contacts for onboarding and issue resolution.
-Healthcare Interoperability (HL7, X12 EDI, FHIR): Develop, parse, validate, and transform HL7 v2.x messages and X12 EDI transactions (270/271, 276/277, 834, 835, 837, 999/TA1). Design and support FHIR R4 APIs — including Patient Access, Provider Directory, and Payer-to-Payer exchange — aligned with CMS interoperability mandates and SMART on FHIR authorization.
-RESTful API Development & Management: Design, implement, and maintain secure RESTful APIs. Enforce standards for versioning, authentication (OAuth 2.0/JWT), rate limiting, and error handling. Manage APIs through Azure API Management. Create and maintain OpenAPI/Swagger specifications.
-Event-Driven Integration: Design and implement event-driven integration patterns using Azure Event Hubs and/or Service Bus to support near-real-time clinical and operational data exchange between systems. Evaluate Apache Kafka / Confluent for use cases requiring true streaming.
-Boomi Atom & Environment Management: Configure, deploy, and manage Boomi Atoms and Atom Clouds across environments (dev, test, prod). Maintain environment promotion pipelines using CI/CD practices (Azure DevOps / GitHub Actions) and adhere to version control and release management standards.
-Data Landing to Snowflake & Azure: Develop and maintain integration pipelines that reliably land and stage data in Snowflake (via Snowpipe, external stages, and Streams) and Azure Data Lake Storage Gen2, in alignment with data structures and contracts defined by the Data Engineering team.
-Integration Security & HIPAA Compliance: Ensure all integrations comply with HIPAA, CMS interoperability rules, and organizational security policies. Manage credentials and secrets via Azure Key Vault. Apply zero-trust principles, including private endpoints and VNet integration, for pipelines carrying PHI. Implement PHI de-identification where required for analytics use cases.
-Error Handling, Monitoring & Alerting: Implement comprehensive error detection, exception handling, and alerting across all Boomi-based integrations. Configure monitoring dashboards and threshold-based alerting using Boomi AtomSphere monitoring and Azure Monitor. Maintain SLA commitments for data freshness and delivery.
-Incident Response & Advanced Support: Serve as the escalation point for integration and API incidents. Conduct root cause analysis, implement fixes, and communicate resolution status to stakeholders. Provide on-call support when required for critical integration failures.
-Quality Assurance & Data Validation: Perform unit testing, peer code reviews, and integration testing for all developed solutions. Implement data validation and reconciliation checks at integration boundaries. Adhere to team coding standards and maintain version control for all integration assets.
-Technical Documentation: Create and maintain integration specifications, interface control documents, message mapping guides, sequence diagrams, and operational runbooks in the team knowledge repository (Confluence or equivalent). Keep documentation current with every change.
-Cross-Functional Collaboration: Partner with architects, Data Engineering, clinical informatics, QA, PMO, and business stakeholders to define integration requirements and deliver solutions. Coordinate handoff points and data contract agreements with the Data Engineering team. Communicate risks, issues, and dependencies proactively.
-Vendor Coordination & Oversight: Assist with technical relationships with integration vendors, interface engine vendors, and payer/partner technical contacts. Participate in onboarding, scoping, and troubleshooting of third-party integration components. Escalate vendor performance issues as needed.
MINIMUM EDUCATION:
-Bachelor's degree in computer science or a related field, OR Associate's degree in computer science or a related field with 2 years of additional experience, OR High school diploma/equivalent with 4 years of additional experience
REQUIRED EXPERIENCE:
-1 year in Healthcare IT
-1 year as an application, database, and/or integration developer
-1 year with an integration platform (Boomi, Mirth Connect, Cloverleaf, MuleSoft, or equivalent)
-1 year with ETL platforms such as Pentaho, Informatica, Talend, and/or SSIS
-1 year with at least one healthcare interoperability standard: HL7 v2.x, X12 EDI, or FHIR APIs
-2 years with SQL
REQUIRED LICENSURE/CERTIFICATIONS:
-Epic certification (within 6 months of hire)
-Boomi Professional Developer certification (within 6 months of hire)