About this role
<p class="p1">• <strong>🧠 Tech Level:</strong> Senior</p><p class="p1">• <strong>🗣 Language Proficiency:</strong> Upper-Intermediate</p><p class="p1">• <strong>👥 FTE:</strong> 5</p><p class="p1">• <strong>🧾 Employment type:</strong> Full-time</p><p class="p1">• <strong>🌍 Candidate Location:</strong> EU, USA</p><p class="p1">• <strong>🕐 Working Time Zone:</strong> CET, PST (overlap for several hours)</p><p class="p1">• <strong>🚀 Start:</strong> April 2026</p><p class="p2"></p><p class="p1"><strong>🧩 Project Description:</strong></p><p class="p1">The project integrates leading data and AI platforms into an insurance delivery framework, creating a unified, data-driven, AI-enabled insurance operations platform.</p><p class="p2"></p><p class="p1"><strong>⚙️ Project Phase:</strong> Active development</p><p class="p2"></p><p class="p1"><strong>🤝 Soft Skills:</strong></p><p class="p1">• Stakeholder management</p><p class="p1">• Strong communication with technical and non-technical users</p><p class="p1">• Business and product thinking</p><p class="p1">• User empathy</p><p class="p1">• Ability to work in ambiguous environments</p><p class="p1">• High ownership and accountability</p><p class="p1">• Comfort working under pressure</p><p class="p1">• Willingness to travel and work on-site</p><p class="p2"></p><p class="p1"><strong>💡 Hard Skills / Must Have:</strong></p><p class="p1">• 2–3+ years of hands-on experience with Palantir Foundry, including proven production deployments and customer implementation cases</p><p class="p1">• Strong proficiency in Python, PySpark, and SQL; experience with Java or Scala is a plus</p><p class="p1">• Experience building and operating distributed data pipelines (Spark, large-scale ETL/ELT, data modelling)</p><p class="p1">• Experience integrating enterprise systems via REST APIs, JDBC, SFTP, and cloud data platforms</p><p 
class="p1">• Experience with or strong understanding of AI-enabled applications, LLM workflows, and RAG systems</p><p class="p1">• Bachelor’s or Master’s degree in Computer Science, Mathematics, Physics, or a related technical field</p><p class="p1">• Certification: Palantir Certified Data Engineer or Application Developer is highly preferred</p><p class="p2"></p><p class="p1"><strong>📌 Responsibilities and Tasks:</strong></p><p class="p1">• Design and implement end-to-end data pipelines in Palantir Foundry</p><p class="p1">• Integrate data from ERP, CRM, financial, operational, and external systems</p><p class="p1">• Build and maintain the Enterprise Ontology (Object Types, Link Types, Actions)</p><p class="p1">• Develop operational applications and dashboards using platform-specific tools</p><p class="p1">• Build AI-powered workflows using AI platforms (RAG, LLM agents, evaluations)</p><p class="p1">• Work directly with customer stakeholders (business, operations, IT, leadership)</p><p class="p1">• Translate business problems into data models, applications, and AI solutions</p><p class="p1">• Support production deployments, performance tuning, and scaling</p><p class="p1">• Work on-site with customers as required</p><p class="p2"></p><p class="p1"><strong>🧪 Technology Stack:</strong> Palantir Foundry and AIP, Python, PySpark, SQL, Spark, REST/JDBC, industry-standard insurance software, cloud data platforms, AI/LLM systems, TypeScript, React</p><p class="p2"></p><p class="p1"><strong>📩 Ready to Join?</strong><br>We look forward to receiving your application and welcoming you to our team!</p>