Summary
Great work-life balance with a WFH policy and 20 days of annual leave. Join a high-impact team building the firm's next-generation enterprise data platform.

About Our Client
Our client is a medium-sized organization in the insurance industry with a strong focus on innovation and advanced technology. The company fosters a forward-thinking environment to deliver top-tier solutions to its clients.

Job Description
- Take ownership of day-to-day operations of data platforms and pipelines, including capacity management, stability tuning, upgrades, deployments, and recovery drills, to ensure high availability and low latency.
- Design and manage multi-source data ingestion, covering external exchanges, internal systems, and third-party sources, with robust protocol parsing and retry mechanisms.
- Build and implement rule-based and statistical data quality checks (e.g., completeness, uniqueness, time alignment, anomaly detection, error handling).
- Develop automated remediation flows, reconciliation processes, and historical backfilling capabilities.
- Establish monitoring and alerting frameworks to guarantee production-grade, trustworthy datasets.
- Enforce data access controls, encryption, auditing, and data classification to comply with internal governance and external regulatory standards, including PII management.
- Plan and maintain scalable ETL/ELT pipelines, covering scheduling, caching, partitioning, modeling, schema evolution, and data lineage for both batch and real-time streams.
- Apply Infrastructure-as-Code, data versioning, data tests, and CI/CD practices to improve platform predictability and reduce operational risk.
- Contribute to embedded GenAI- and LLM-powered data applications supporting analytics, reconciliation, and enterprise productivity use cases.
- Collaborate with analytics and product teams to operationalize AI-driven data solutions.

The Successful Applicant
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- Extensive experience in data engineering, data platform architecture, or AI/ML engineering.
- Strong hands-on experience with modern cloud data platforms such as Snowflake, Databricks, BigQuery, or Redshift.
- Practical experience building BI data foundations and supporting GenAI/LLM architectures.
- Proficiency in SQL, workflow orchestration tools (e.g., Airflow), streaming technologies (e.g., Kafka), and modern pipeline design methodologies.
- Solid understanding of data warehouse development lifecycles and dimensional modeling principles.
- Familiarity with GitLab and CI/CD pipelines.
- Strong debugging, performance optimization, and problem-solving skills.
- Working knowledge of data governance, data lineage, privacy, and security frameworks.

What's on Offer
- Competitive salary and benefits package.
- 20 days of annual leave with a WFH policy.
- Permanent position with career growth opportunities.
- Exposure to cutting-edge technologies within the insurance industry.
- Collaborative and supportive work environment.

If you are ready to take the next step in your career as a Senior Data Engineer, apply now to join an innovative team in the insurance sector.

Contact
Lewis Liu
Quote job ref: JN-032026-6963272
Phone number: +852 2530 6126

Job Summary
Function: IT
Specialisation: IT Data Analysis
Industry: Insurance
Location: Kwun Tong
Job Type: Permanent
Consultant name: Lewis Liu
Consultant phone: +852 2530 6126
Job reference: JN-032026-6963272
Work from Home: Work from Home or Hybrid