Job Title | Location | Description | Posted |
---|---|---|---|
Manager, Data Engineering [Remote-US]
Quanata |
San Francisco, CA
|
To help keep everyone safe, we encourage all applicants to pay close attention to protecting themselves during their job search. When applying for a position online, you are at risk of being targeted by malicious actors looking for personal data. Please be aware that we will only reach out via email using the domain quanata.com. Anything that does not match that domain should be ignored and considered a security risk.

#### About Us
Quanata is on a mission to help ensure a better world through context-based insurance solutions. We are an exceptional, customer-centered team with a passion for creating innovative technologies, digital products, and brands. We blend some of the best Silicon Valley talent and cutting-edge thinking with the long-term backing of leading insurer State Farm. Learn more about us and our work at quanata.com.

#### Our Team
From data scientists and actuaries to engineers, designers, and marketers, we're a world-class team of tech-minded professionals from some of the best companies in Silicon Valley and around the world. We've come together to create the context-based insurance solutions and experiences of the future. We know that the key to our success isn't just about nailing the technology; it's hiring the talented people who will help us continue to make a quantifiable impact.

#### The role
We're looking for an accomplished and well-rounded manager to lead the Data Engineering team in designing, building, and maintaining scalable data pipelines and storage systems that support data science, analytics, and data-driven decision-making. This role sits at a critical intersection of technical expertise, team leadership, and strategic planning, aimed at optimizing the data architecture to meet evolving business needs.

#### Your day-to-day
- Lead, mentor, and grow a high-performing team of Data Engineers, fostering a culture of engineering excellence, innovation, and cross-functional collaboration.
- Drive the team's design and implementation of automated CI/CD/CT pipelines and ETL/ELT processes to accelerate the entire model lifecycle, from data validation and training to production deployment and monitoring.
- Establish Data Engineering best practices to ensure data accuracy, consistency, security, and adherence to governance frameworks.
- Monitor the performance of data systems and implement improvements to enhance efficiency and reduce latency.
- Partner with key stakeholders, data analysts, and business leaders to define platform requirements, drive adoption, and ensure the DE team delivers measurable business value.
- Oversee the integration of security controls (DevSecOps) throughout the data engineering lifecycle, including data encryption, access management, and vulnerability scanning, to protect sensitive data and meet industry standards.

#### About you
- BS degree in Computer Science, Engineering, or a related field, or equivalent relevant experience
- 6-8 years of professional experience in data management, software engineering, or DevOps engineering
- Proven leadership experience with a track record of managing and developing technical teams
- Excellent communication and stakeholder management skills, with a proven ability to partner with business leaders, define technical roadmaps, and drive consensus across teams
- Deep hands-on expertise with Infrastructure as Code (IaC) using Terraform, and container orchestration with Kubernetes and Docker in production environments
- Extensive experience working in cloud-native environments, with a strong preference for AWS
- Strong programming skills in Python and a solid understanding of the end-to-end machine learning lifecycle, from experimentation to production monitoring
- Problem-solving: the ability to resolve complex data architecture challenges and drive technical roadmaps
#### Bonus points
- Experience in regulated industries: a strong understanding of the technical controls required for compliance frameworks like SOX, HIPAA, or PCI DSS within industries such as Finance, Insurance, or Healthcare
- Deep domain knowledge in Financial Services or Insurance: a proven understanding of specific use cases like automated claims processing, intelligent underwriting, and fraud detection

Salary: $315,000 to $380,000. Please note that the final salary offered will be determined based on the selected candidate's skills and experience, as well as the internal salary structure at Quanata. Our aim is to offer a competitive and equitable compensation package that reflects the candidate's expertise and contributions to our organization.

Additional Details:

Benefits: We provide a wide variety of health, wellness, and other benefits. These include medical, dental, vision, life insurance, and supplemental income plans for you and your dependents, a Headspace app subscription, a monthly wellness allowance, and a 401(k) Plan with a company match.

Work from Home Equipment: Given our virtual environment, in order to set you up for success at home, a one-time payment of $2K will be provided to cover the purchase of in-home office equipment and furniture at your discretion. Also, our teams work with MacBook Pros, which we will deliver to you fully provisioned prior to your first day.

Paid Time Off: All employees accrue four weeks of PTO in their first year of employment. New parents receive twelve weeks of fully paid parental leave, which may be taken within one year after the birth and/or adoption of a child. The twelve weeks apply to both birthing and non-birthing parents.

Personal and Professional Development: We're committed to investing in and helping our people grow personally and professionally. All employees receive up to $5,000 each year for professional learning, continuing education, and career development.
All team members also receive LinkedIn Learning subscriptions and access to multiple coaching opportunities through BetterUp.

Location: We are a remote-first company for most positions, so you may work from anywhere you like in the U.S., excluding U.S. territories. For most positions, occasional travel may be requested or encouraged but is not required. Some positions might require travel per the job description provided to the employee. Employees based in the San Francisco Bay Area or in Providence, Rhode Island may commute to one of our local offices as desired.

Hours: We maintain core meeting hours from 9AM - 2PM Pacific time for collaborating with team members across all time zones.

Quanata LLC is an equal opportunity workplace. We are committed to equal employment opportunities regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. Pursuant to the San Francisco Fair Chance Ordinance, we will consider for employment qualified applicants with arrest and conviction records. If you are a San Francisco resident, please read the City and County of San Francisco's Fair Chance Ordinance notice: https://www.sf.gov/sites/default/files/2022-12/FCO%20poster20200.pdf

This role is employed by Quanata LLC, which is a separate company in the State Farm family of companies. If you require a reasonable accommodation, please reach out to your Talent Acquisition Partner for assistance.
|
|
Data Engineer (REMOTE)
Koniag Government Services |
Remote, United States
|
Koniag IT Systems, LLC, a Koniag Government Services company, is seeking a Data Engineer to support KITS and our government customer. This position requires the candidate to be able to obtain a Public Trust. This is a remote opportunity.

We offer competitive compensation and an extraordinary benefits package, including health, dental, and vision insurance, 401K with company matching, flexible spending accounts, paid holidays, three weeks paid time off, and more.

Education and Experience Required:
- Bachelor's degree in Computer Science, Information Technology, Data Engineering, or a related field
- 5+ years of experience in data engineering, data architecture, or similar technical roles
- Demonstrated expertise in database systems, data warehousing, and ETL processes
- 5+ years as a Python programming language expert

Required Skills and Competencies:

Data Engineering and Development:
- Strong SQL skills and experience with relational and non-relational databases
- Knowledge of data modeling, schema design principles, and data normalization
- Experience with data migrations from legacy data systems (e.g., MS Access, SQL Server, SharePoint)
- Expert in programming languages such as Python, Java, or Scala, with experience in building data pipelines
- Experience with big data technologies (e.g., Hadoop, Spark, Kafka) and their application in ETL workflows
- Familiarity with data processing in cloud platforms such as AWS, Azure, or Google Cloud

DevOps and CI/CD:
- Proven experience setting up and maintaining CI/CD pipelines for data engineering workflows
- Hands-on experience with CI/CD tools such as GitHub Actions, AWS CodePipeline, and Azure Pipelines
- Expertise in containerization and orchestration tools like Docker and Kubernetes
- Experience with Infrastructure as Code (IaC) tools like Terraform, AWS CloudFormation, or Azure Resource Manager
- Familiarity with version control (Git) and best practices in source code management

Serverless and Cloud Functions:
- Experience building and deploying serverless functions, including AWS Lambda and Azure Functions
- Ability to design scalable and efficient serverless workflows for data ingestion, processing, and transformation
- Knowledge of event-driven architecture and experience implementing automated triggers and workflows

Desired Skills and Competencies:

Data Visualization and Analytics:
- Experience with data visualization tools (e.g., Tableau, Power BI) for developing reports and dashboards
- Familiarity with machine learning and data science concepts to support advanced analytics

Compliance and Governance:
- Understanding of data governance frameworks, security best practices, and compliance standards (e.g., GLBA, FINRA, SEC Regulations, FFIEC)
- Experience with data security, encryption, and identity/access management

Industry Experience:
- Background in government or federal IT projects is a plus, with knowledge of relevant compliance and regulatory standards

Other Key Skills:
- Excellent problem-solving and analytical abilities with attention to detail
- Strong communication and collaboration skills in cross-functional and distributed teams
- Ability to obtain Public Trust Clearance

Our Equal Employment Opportunity Policy

The company is an equal opportunity employer.
The company shall not discriminate against any employee or applicant because of race, color, religion, creed, ethnicity, sex, sexual orientation, gender or gender identity (except where gender is a bona fide occupational qualification), national origin or ancestry, age, disability, citizenship, military/veteran status, marital status, genetic information, or any other characteristic protected by applicable federal, state, or local law. We are committed to equal employment opportunity in all decisions related to employment, promotion, wages, benefits, and all other privileges, terms, and conditions of employment. The company is dedicated to seeking all qualified applicants. If you require an accommodation to navigate or apply for a position on our website, please get in touch with Heaven Wood via e-mail at accommodations@koniag-gs.com or by calling 703-488-9377 to request accommodations.

Koniag Government Services (KGS) is an Alaska Native Owned corporation supporting the values and traditions of our native communities through an agile employee and corporate culture that delivers Enterprise Solutions, Professional Services, and Operational Management to Federal Government Agencies. As a wholly owned subsidiary of Koniag, we apply our proven commercial solutions and a deep knowledge of Defense and Civilian missions to provide forward-leaning technical, professional, and operational solutions. KGS enables successful mission outcomes for our customers through solution-oriented business partnerships and a commitment to exceptional service delivery. We ensure long-term success with a continuous improvement approach while balancing the collective interests of our customers, employees, and native communities. For more information, please visit www.koniag-gs.com.

Equal Opportunity Employer/Veterans/Disabled. Shareholder Preference in accordance with Public Law 88-352.
|
|
Azure Data Engineering Architect (Remote - India)
Jobgether |
Remote, India
|
This position is posted by Jobgether on behalf of a partner company. We are currently looking for an Azure Data Engineering Architect in India.

We are seeking a highly experienced Azure Data Engineering Architect to lead and shape data platform initiatives. In this role, you will define architecture patterns, implement best practices, and create reusable accelerators for large-scale Azure data solutions. You will work closely with cross-functional teams, guiding technical decisions, mentoring junior engineers, and ensuring the delivery of high-quality, scalable data platforms. This position requires a strong blend of hands-on technical expertise, strategic thinking, and client-facing communication skills. You will have the opportunity to influence cloud adoption strategies, drive innovation in data engineering, and contribute to complex, high-impact projects. A deep understanding of Azure services, data integration patterns, and ETL/ELT frameworks is essential.

Accountabilities:
- Lead the Azure pillar within the Data Engineering CoE, establishing technical standards and best practices for Azure data platform implementations
- Design architecture patterns and reusable accelerators for Azure Synapse Analytics, Data Factory, Azure Databricks, and other Azure services
- Support pre-sales activities, providing technical guidance and contributing to proposals for Azure-based opportunities
- Plan and design migration pathways from legacy systems to Azure environments
- Create technical documentation, playbooks, and implementation frameworks to standardize Azure deployments
- Mentor and coach junior team members on Azure best practices and architecture principles
- Collaborate with other architects to deliver cross-platform solutions using cloud-agnostic tools such as Snowflake and Databricks
- Serve as a trusted technical advisor for clients, building strong relationships and guiding Azure implementations

Requirements:
- 10+ years of experience in data engineering, with at least 5 years focused on Azure technologies
- Deep expertise in Azure Synapse Analytics, Azure Data Factory, Azure Databricks, and Azure Data Lake Storage Gen2
- Proven experience in designing and implementing ETL/ELT pipelines and data integration patterns
- Experience migrating legacy systems to Azure and developing reusable templates or accelerators
- Strong coding skills in Python, SQL, and Spark; experience with .NET is a plus
- Familiarity with Azure DevOps, ARM templates, Bicep, Terraform, and other Infrastructure as Code tools
- Microsoft Certified: Azure Data Engineer Associate or higher
- Excellent communication skills and experience in client-facing roles
- Ability to mentor team members and drive adoption of best practices
- Strong problem-solving and analytical skills with a results-oriented mindset

Benefits:
- Competitive salary and performance-based incentives
- Flexible remote work environment
- Opportunity to work on high-impact, innovative Azure data platform projects
- Professional growth and mentorship from senior technical leaders
- Exposure to a variety of Azure services and cloud-native architectures
- Health and wellness benefits
- Collaborative, supportive, and knowledge-sharing team culture

Jobgether is a Talent Matching Platform that partners with companies worldwide to efficiently connect top talent with the right opportunities through AI-driven job matching. When you apply, your profile goes through our AI-powered screening process, designed to identify top talent efficiently and fairly:

🔍 Our AI evaluates your CV and LinkedIn profile thoroughly, analyzing your skills, experience, and achievements.
📊 It compares your profile to the job's core requirements and past success factors to determine your match score.
🎯 Based on this analysis, we automatically shortlist the 3 candidates with the highest match to the role.
🧠 When necessary, our human team may perform an additional manual review to ensure no strong profile is missed.
The process is transparent, skills-based, and free of bias, focusing solely on your fit for the role. Once the shortlist is completed, we share it directly with the company that owns the job opening. The final decision and next steps (such as interviews or additional assessments) are then made by their internal hiring team. Thank you for your interest!
|
|
Senior Backend Engineer — C#, AWS, Data-Intensive Systems (Remote - Mexico Only)
Varicent |
Guadalajara, Mexico
|
At Varicent, we’re not just transforming the Sales Performance Management (SPM) market; we’re redefining how organizations achieve revenue success. Our cutting-edge SaaS solutions empower revenue leaders globally to design smarter go-to-market strategies, maximize seller performance, and unlock untapped potential. Varicent stands at the forefront of innovation, celebrated as a market leader in the 2025 Forrester Wave Report for SPM, the 2023 Ventana Research Revenue Performance Management (RPM) Value Index, Gartner Peer Insights, the 2024 Gartner SPM Market Guide, and G2. Our solutions are trusted by a diverse range of global industry leaders like T-Mobile, ServiceNow, Wawanesa Bank, Shaw Industries, Moody's, Stryker, and hundreds more.

Here’s why you’ll thrive at Varicent:
- Innovate with Purpose: Build impactful solutions for customers worldwide.
- Join Excellence: Work in a diverse, collaborative, and innovative team.
- Shape the Future: Lead in redefining revenue optimization.
- Grow Together: Unlock your potential in a supportive environment.

Join us at Varicent, where your talent and ambition meet limitless opportunities for success!

About the Role

Join our dynamic Incentives Development team, where innovation, collaboration, and technical excellence drive everything we do. As a Software Developer, you’ll contribute to the Varicent Incentives platform, a single-page web app. You’ll work closely with developers, designers, and data scientists to build modern, scalable features that deliver exceptional user experiences.

What You’ll Do
- Develop and refine features to support client success and platform growth
- Collaborate in a cross-functional, knowledge-sharing team
- Work with modern tools and open-source technologies where applicable
- Troubleshoot, optimize, and maintain robust code

What We’re Looking For
- 5 to 8 years of relevant backend development experience, ideally with data-intensive systems.
- Strong proficiency in at least one backend language (C#, Java, C++).
- Hands-on AWS experience (EC2, S3, Lambda, RDS/Redshift, DynamoDB, etc.).
- Knowledge of ELT/ETL concepts and large-data challenges.
- Solid software design fundamentals and a passion for clean, maintainable code.
- Strong communication skills and English level B2 or higher.

🟢 Candidates must be based in Mexico
📄 Only resumes submitted in English will be considered

FIRST 30 DAYS: GETTING STARTED
- Learn and get familiar with our development process, codebase, coding standards, and tools.
- Participate in team meetings, collaborate with colleagues, and effectively communicate progress and challenges.
- Complete small, well-defined tasks or bug fixes to demonstrate the ability to write good code.

FIRST 60 DAYS: BUILDING MOMENTUM
- Become more independent in handling assigned tasks, requiring less supervision and guidance.
- Demonstrate the ability to take on more complex assignments, such as implementing new features or modules.
- Demonstrate an improved ability to troubleshoot and resolve issues efficiently.
- Adhere to coding standards, produce well-documented and maintainable code, and participate in code reviews.
- Collaborate with the team and provide valuable input during discussions.

FIRST 90 DAYS: HITTING YOUR STRIDE
- Meet project deadlines consistently and deliver high-quality work.
- Learn continuously and keep up with industry trends and new technologies.
- Identify areas of improvement in the development process and suggest solutions to enhance efficiency and productivity.
- Make a positive impact on the project and the team's success.

Note: Candidates selected for this position will be hired by Varicent’s designated professional employer organization. Your employment may be transferred to a local Varicent entity in the future. In such an event, Varicent may recognize your seniority and provide you with comparable role responsibilities and benefits. We are excited to welcome you and support you throughout this journey!

This role requires employees to work within Eastern Standard Time (EST) business hours.
While we are open to candidates from outside the EST time zone, please be prepared to adjust your working hours to align with this time zone. Flexibility will be essential to ensure seamless collaboration with the team and stakeholders.

This position is fully remote. We embrace a results-driven work culture, focusing on performance and collaboration over location. As part of our team, you’ll have the opportunity to build a work-life balance that suits you while staying connected with a diverse, global team through virtual tools and regular online communication. Whether you're working from home or a co-working space, we’re committed to supporting you with the resources and autonomy needed to succeed in a remote environment.

Benefits
- Market-Leading Compensation Package
- Wellness Programs to Support Health and Wellbeing
- Working with the latest tools and technologies in a fast-paced environment
- Remote Work Flexibility
- Comprehensive Employee Insurance Coverage: Medical, Dental, Vision, Life Insurance
- Annual Time Off: Time off is provided in accordance with applicable legislative requirements
- Global Connected Culture: Hubs in Romania, UK, US, Canada
- Dynamic Work Culture: Thrive in our innovative and multicultural environment
- Grow with Us: Continuous development opportunities

Varicent is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status. If you require accommodation at any time during the recruitment process, please email accomodations@varicent.com. Varicent is also committed to compliance with all fair employment practices regarding citizenship and immigration status.
By applying for a position at Varicent and/or by using this portal, you declare and confirm that you have read and agree to our Job Applicant Privacy Notice, and that the information provided by you as part of your application is true and complete and includes no misrepresentation or material omission of fact.
|
|
Junior Data Engineer - Databricks (Remote - US)
Lensa |
Reston, VA
|
Lensa is a career site that helps job seekers find great jobs in the US. We are not a staffing firm or agency. Lensa does not hire directly for these jobs but promotes jobs on LinkedIn on behalf of its direct clients, recruitment ad agencies, and marketing partners. Lensa partners with DirectEmployers to promote this job for ICF. Clicking "Apply Now" or "Read more" on Lensa redirects you to the job board/employer site. Any information collected there is subject to their terms and privacy notice.

Description

Please note: This role is contingent upon a contract award. While it is not an immediate opening, we are actively conducting interviews and extending offers in anticipation of the award.

ICF is seeking a Junior Databricks Engineer to support a major data migration initiative with the Department of Transportation. This project involves transitioning from a legacy Oracle-based system to a modern data analytics platform, integrating dozens of data sources and thousands of reports. The role is critical to ensuring successful migration, pipeline development, and data governance implementation.

Job Location: This is a remote position, but all work must be performed within the United States.

What You’ll Be Doing
- Support the migration of data from legacy Oracle systems to Databricks.
- Develop and maintain ETL pipelines using Spark, Python, and SQL.
- Assist in setting up and managing Unity Catalog for data governance.
- Perform QA and validation of migrated datasets and reports.
- Collaborate with the engineering team to ensure smooth integration and performance.
- Report to the Senior Data Engineer and contribute to cross-functional team efforts.

What You Must Have
- Bachelor’s degree in Computer Science, Information Systems, Analytics, or a related field.
- At least 1 year of experience working with Databricks, Spark, Python, SQL, or Unity Catalog.
- Must be eligible for federal clearance and pass a background check.
- Candidate must have lived in the U.S. for three (3) full years out of the last five (5) years.

What We’d Like You To Have (preferred)
- Experience with Oracle databases and Oracle Business Intelligence.
- Familiarity with government data systems and compliance requirements.
- Exposure to large-scale data migration projects.
- Understanding of Agile techniques such as User Stories, TDD, BDD, CI/CD, and automated testing.
- Strong understanding of data engineering best practices.
- Experience working in Agile environments.

Professional Skills
- Ability to work independently in a fast-paced, deadline-driven setting.

Working at ICF

ICF is a global advisory and technology services provider, but we’re not your typical consultants. We combine unmatched expertise with cutting-edge technology to help clients solve their most complex challenges, navigate change, and shape the future. We can only solve the world's toughest challenges by building a workplace that allows everyone to thrive. We are an equal opportunity employer. Together, our employees are empowered to share their expertise and collaborate with others to achieve personal and professional goals. For more information, please read our EEO policy (https://www.icf.com/legal/equal-employment-opportunity). We will consider for employment qualified applicants with arrest and conviction records.

Reasonable Accommodations are available, including but not limited to for disabled veterans, individuals with disabilities, and individuals with sincerely held religious beliefs, in all phases of the application and employment process. To request an accommodation, please email Candidateaccommodation@icf.com and we will be happy to assist. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodations. Read more about workplace discrimination rights or our benefit offerings, which are included in the Transparency in (Benefits) Coverage Act.
Candidate AI Usage Policy At ICF we are committed to ensuring a fair interview process for all candidates based on their own skills and knowledge. As part of this commitment the use of artificial intelligence (AI) tools to generate or assist with responses during interviews (whether in-person or virtual) is not permitted. This policy is in place to maintain the integrity and authenticity of the interview process. However we understand that some candidates may require accommodation that involves the use of AI. If such an accommodation is needed candidates are instructed to contact us in advance at candidateaccommodation@icf.com . We are dedicated to providing the necessary support to ensure that all candidates have an equal opportunity to succeed. Pay Range - There are multiple factors that are considered in determining final pay for a position including but not limited to relevant work experience skills certifications and competencies that align to the specified role geographic location education and certifications as well as contract provisions regarding labor categories that are specific to the position. The pay range for this position based on full-time employment is: $73722.00 - $125327.00 Nationwide Remote Office (US99) If you have questions about this posting please contact support@lensa.com"
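The "QA and validation of migrated datasets" duty above typically means reconciling each legacy Oracle table against its migrated Databricks copy. A minimal sketch in plain Python, assuming row counts and per-column checksums have already been collected from both systems (the table stats below are hypothetical stand-ins for real query results):

```python
# Minimal migration QA sketch: compare row counts and per-column checksums
# between a legacy table and its migrated copy. In a real pipeline the two
# dicts would be filled by queries against Oracle and Databricks; here they
# are hypothetical in-memory stand-ins.

def validate_migration(legacy: dict, migrated: dict) -> list:
    """Return a list of human-readable discrepancies (empty list = pass)."""
    issues = []
    if legacy["row_count"] != migrated["row_count"]:
        issues.append(
            "row count mismatch: %d vs %d"
            % (legacy["row_count"], migrated["row_count"])
        )
    for col, checksum in legacy["checksums"].items():
        if migrated["checksums"].get(col) != checksum:
            issues.append("checksum mismatch in column %r" % col)
    return issues

# Fabricated example: counts agree, but one column drifted in migration.
legacy = {"row_count": 1000, "checksums": {"id": 11111, "amount": 22222}}
migrated = {"row_count": 1000, "checksums": {"id": 11111, "amount": 99999}}
print(validate_migration(legacy, migrated))  # ["checksum mismatch in column 'amount'"]
```

An empty result list is the sign-off condition for a table; anything else is routed back to the pipeline owner before the legacy source is retired.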
|
|
Senior Data Engineer - Remote
Optum |
San Diego, CA
|
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
You'll enjoy the flexibility to work remotely from anywhere within the U.S. as you take on some tough challenges.
Primary Responsibilities:
- Design, build, and operate scalable ELT/ETL pipelines (batch and streaming) with strong SLAs and observability
- Model source data into well-governed, analytics-ready layers (staging, marts) using dimensional and/or Data Vault patterns
- Lead the evolution of our lakehouse/warehouse architecture; drive partitioning, file layout, and table format best practices
- Implement data quality testing and lineage at scale; champion data contracts with upstream teams
- Optimize performance and cost across compute, storage, and query engines; establish capacity plans and autoscaling
- Partner with product, analytics, and ML to define semantic layers, metrics, and feature/data products
- Uphold security, privacy, and governance standards; manage access controls for PHI and sensitive data
- Mentor engineers, review designs/PRs, and contribute to standards, runbooks, and documentation
- Leverage enterprise-approved AI tools to enhance productivity and innovation by streamlining workflows and automating repetitive tasks
You'll be rewarded and recognized for your performance in an environment that will challenge you and give you clear direction on what it takes to succeed in your role, as well as provide development for other roles you may be interested in.
Required Qualifications:
- 5+ years of professional data engineering experience in cloud environments (Azure or AWS)
- 5+ years of expert-level SQL and strong Python (or Scala) for data processing and tooling
- 3+ years of proven experience with a major warehouse (Snowflake, BigQuery, Redshift) and a lake/lakehouse stack (e.g., Databricks, Spark, Delta/Iceberg/Hudi)
- 3+ years of experience with solid data modeling skills; comfortable designing star schemas and incremental pipelines
Preferred Qualifications:
- AWS and Azure experience
- .NET / C# experience
- GitHub Actions experience
- Previous healthcare background
- Hands-on with orchestration (Airflow, Dagster, Prefect) and modern ELT (dbt or equivalent)
All employees working remotely will be required to adhere to UnitedHealth Group's Telecommuter Policy.
Pay is based on several factors, including but not limited to local labor markets, education, work experience, certifications, etc. In addition to your salary, we offer benefits such as a comprehensive benefits package, incentive and recognition programs, equity stock purchase, and 401k contribution (all benefits are subject to eligibility requirements). No matter where or when you begin a career with us, you'll find a far-reaching choice of benefits and incentives. The salary for this role will range from $89,900 to $160,600 annually, based on full-time employment. We comply with all minimum wage laws as applicable.
Application Deadline: This will be posted for a minimum of 2 business days or until a sufficient candidate pool has been collected. Job posting may come down early due to volume of applicants.
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location, and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
UnitedHealth Group is an Equal Employment Opportunity employer under applicable law, and qualified applicants will receive consideration for employment without regard to race, national origin, religion, age, color, sex, sexual orientation, gender identity, disability or protected veteran status, or any other characteristic protected by local, state, or federal laws, rules, or regulations. UnitedHealth Group is a drug-free workplace. Candidates are required to pass a drug test before beginning employment.
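The "incremental pipelines" named in the qualifications usually come down to watermark-based loads: each run processes only rows newer than the last successfully committed timestamp, then advances that mark. A minimal sketch in plain Python, assuming in-memory records stand in for warehouse rows (a real pipeline would query the source and persist the watermark durably):

```python
# Watermark-based incremental load: each run picks up only rows whose
# updated_at is strictly greater than the stored high-water mark, then
# advances the mark. Source rows here are hypothetical in-memory records.

def incremental_load(source_rows, watermark):
    """Return (new_rows, new_watermark) for rows past the watermark."""
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

rows = [
    {"id": 1, "updated_at": 100},
    {"id": 2, "updated_at": 205},
    {"id": 3, "updated_at": 310},
]

# First run: everything after watermark 0 is new.
batch, mark = incremental_load(rows, 0)
print(len(batch), mark)  # 3 310

# Second run with no new source rows: empty batch, watermark unchanged.
batch, mark = incremental_load(rows, mark)
print(len(batch), mark)  # 0 310
```

The strict `>` comparison is what makes re-runs idempotent: replaying a run after a failure never double-loads rows already covered by the committed watermark.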
|
|