Remote Amazon Redshift Jobs

226 remote jobs

Job Title | Company | Location | Description | Posted
Database Administrator
LetsGetChecked
Miami, FL
LetsGetChecked is a global healthcare solutions company that provides the tools to manage health from home through health testing, virtual care, genetic sequencing, and medication delivery for a wide range of health and wellness conditions. LetsGetChecked's end-to-end model includes manufacturing, logistics, lab analysis, physician support, and prescription fulfillment. Founded in 2015 and co-headquartered in Dublin and Atlanta, LetsGetChecked empowers people to take control of their health and live longer, happier lives. We're looking for a skilled Database Administrator whose primary focus will be on our SQL Server and PostgreSQL environments in AWS. You'll also gain significant exposure to our Amazon Redshift data warehouse clusters and other cutting-edge data technologies. If you're passionate about data, eager to learn, and want to advance your career in a dynamic environment, this role is for you.

### Responsibilities
- Administer & Optimize: Take ownership of the tuning, optimization, and administration of our core database systems, including MSSQL, PostgreSQL (RDS/Aurora), and Redshift.
- Performance Monitoring: Proactively monitor the performance, resource utilization, and query throughput of our database systems using tools like DataDog, Grafana, CloudWatch, and Splunk.
- Code & Troubleshoot: Identify and resolve performance bottlenecks in T-SQL and PL/pgSQL, including stored procedures, and work directly with our engineering teams to optimize code.
- Automate & Script: Develop and maintain automation scripts using Python or PowerShell for AWS Lambda functions and Octopus Deploy pipelines (see the sketch after this listing's requirements).
- Database Releases: Manage database releases across all development, staging, and production environments, ensuring our high standards are consistently met.
- Maintain & Document: Keep our database documentation current and contribute to our knowledge base in Confluence.
- Support & Collaborate: Work efficiently through the JIRA ticket queue to handle database-related requests and collaborate with our talented team of engineers.

### What we are looking for…
- A BS in Computer Science or a related field, or equivalent real-world experience.
- 3+ years of professional experience administering production databases, with a strong focus on SQL Server and PostgreSQL in a cloud environment.
- Proven hands-on experience with AWS database services (RDS, Aurora, Redshift).
- Strong scripting skills in both T-SQL and PL/pgSQL.
- Solid experience with database automation using Python and/or PowerShell.
- A deep understanding of best practices for managing highly available 24/7 database environments.

### Preferred Experience
- Experience with other cloud platforms such as Microsoft Azure and/or GCP.
- Familiarity with NoSQL databases like MongoDB Atlas or other relational databases like MySQL.
- Experience with large-scale cloud migration projects.
- Active database certifications (e.g. AWS Certified Database – Specialty, PostgreSQL Essentials/Advanced Certification, Azure Database Administrator Associate).
- Experience using troubleshooting tools like wait statistics, extended events, or system views.
- Familiarity with CI/CD automation tools like Jenkins, GitHub Actions, or Octopus Deploy.
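The posting does not include code, but as a rough illustration of the kind of Python/Lambda automation it describes (monitoring Redshift query throughput and alerting), here is a minimal sketch of a Lambda handler that counts long-running Redshift queries via the Redshift Data API and publishes the count as a CloudWatch metric. The cluster identifier, database, user, threshold, and metric namespace are hypothetical placeholders, and the role running the function is assumed to have the corresponding permissions.

```python
# Minimal sketch (not from the posting): a Lambda handler that counts Redshift
# queries running longer than a threshold and publishes the count to CloudWatch.
# Cluster/database names and the metric namespace are hypothetical placeholders.
import time
import boto3

redshift_data = boto3.client("redshift-data")
cloudwatch = boto3.client("cloudwatch")

SQL = """
    SELECT COUNT(*)
    FROM stv_recents
    WHERE status = 'Running'
      AND duration > 5 * 60 * 1000000;  -- duration is reported in microseconds
"""

def handler(event, context):
    # Submit the query via the Redshift Data API and poll until it finishes.
    stmt = redshift_data.execute_statement(
        ClusterIdentifier="example-cluster",   # placeholder
        Database="analytics",                  # placeholder
        DbUser="admin",                        # placeholder
        Sql=SQL,
    )
    while True:
        desc = redshift_data.describe_statement(Id=stmt["Id"])
        if desc["Status"] in ("FINISHED", "FAILED", "ABORTED"):
            break
        time.sleep(1)
    if desc["Status"] != "FINISHED":
        raise RuntimeError(f"Monitoring query did not finish: {desc['Status']}")

    result = redshift_data.get_statement_result(Id=stmt["Id"])
    long_running = int(result["Records"][0][0]["longValue"])

    # Publish the count so a CloudWatch alarm can page the on-call DBA.
    cloudwatch.put_metric_data(
        Namespace="Example/Redshift",          # placeholder namespace
        MetricData=[{"MetricName": "LongRunningQueries", "Value": long_running}],
    )
    return {"long_running_queries": long_running}
```

Scheduled from an EventBridge rule, a function along these lines covers the "proactive monitoring" and "automation" bullets with a few dozen lines of code; the same pattern extends to RDS/Aurora by swapping the system view and client.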
Benefits: Alongside a salary of $135,000 - $150,000 (depending on experience), we offer a range of benefits including:
- Health, dental & vision insurance
- 401k matching contribution
- Employee Assistance Programme
- Annual compensation reviews
- Flexible PTO policy and 3 paid volunteer days per year
- Free monthly LetsGetChecked tests, as we are not only focused on the well-being of our patients but also the well-being of our teams
- A referral bonus programme to reward you for helping us hire the best talent
- Internal opportunities and Careers Clinics to help you progress your career
- Maternity, Paternity, Parental, and Wedding leave

#LI-IF #LI-Remote

Why LetsGetChecked? At LetsGetChecked, we are revolutionizing healthcare by making it more accessible, convenient, and personalized. Our mission is to empower individuals with the knowledge and tools they need to manage their health proactively, so they can live longer, happier lives. By joining our team, you will be part of a dynamic and innovative company that is dedicated to improving lives through cutting-edge technology and compassionate care. We value our employees and invest in their growth, offering opportunities for professional development and career advancement. Together we can make a meaningful impact on the future of healthcare and help people take control of their health journey. Join us in our commitment to transforming healthcare for the better.

Our Commitment to Diversity, Equity, and Inclusion: At LetsGetChecked, we are committed to fostering an inclusive environment that celebrates diversity in all its forms. We believe that diversity of thought, background, and experience strengthens our teams and drives innovation. We are an equal-opportunity employer and do not discriminate on the basis of race, ethnicity, religion, color, place of birth, sex, gender identity or expression, sexual orientation, age, marital status, military service status, or disability status. Our goal is to ensure that everyone feels valued and empowered to thrive. To learn more about LetsGetChecked and our mission to help people live longer, healthier lives, please visit https://www.letsgetchecked.com/careers/
39 minutes ago
Senior Reporting Engineer
dLocal
Remote, Uruguay
Why should you join dLocal? dLocal enables the biggest companies in the world to collect payments in 40 countries in emerging markets. Global brands rely on us to increase conversion rates and simplify payment expansion effortlessly. As both a payments processor and a merchant of record where we operate, we make it possible for our merchants to make inroads into the world's fastest-growing emerging markets. By joining us you will be a part of an amazing global team that makes it all happen, in a flexible, remote-first, dynamic culture with travel, health, and learning benefits, among others. Being a part of dLocal means working with 1000+ teammates from 30+ different nationalities and developing an international career that impacts millions of people's daily lives. We are builders, we never run from a challenge, we are customer-centric, and if this sounds like you, we know you will thrive in our team. We are looking for a Software Engineer who wants to build high-performance, scalable, enterprise-grade applications. You'll be part of a talented software team working on apps to deliver insights to big clients like Netflix, Amazon, Nike, Facebook, and more!

### What will I be doing?
- Contributing to all phases of the analytical application development life cycle
- Designing, developing, and delivering high-volume applications for data analytics systems
- Writing well-designed, testable, and efficient code
- Ensuring designs are in compliance with specifications
- Supporting continuous improvement by investigating alternatives and technologies and presenting these for architectural review

### What skills do I need?
- Great knowledge of Python
- Great knowledge of SQL & DBMS, Apache Iceberg
- Great knowledge of distributed processing (Apache Spark, Hadoop, Hive, Presto, or similar)
- Deep understanding of data modelling (star schema, snowflake) and manipulation/cleansing
- Good knowledge of non-relational databases (NoSQL) and semi-structured/unstructured data
- Experience with the AWS environment (S3, Redshift, RDS, SQS, Athena, Glue, CloudWatch, EMR, Lambda, or similar)
- Experience with code versioning (GitHub or similar)
- Experience in batch processing (ETL/ELT) (see the sketch after this listing)
- Advanced/fluent English

### What do we offer?
Besides the tailored benefits we have for each country, dLocal will help you thrive and go that extra mile by offering you:
- Remote work: work from anywhere or one of our offices around the globe!
- Flexibility: we have flexible schedules and we are driven by performance.
- Fintech industry: work in a dynamic and ever-evolving environment with plenty to build and boost your creativity.
- Referral bonus program: our internal talents are the best recruiters - refer someone ideal for a role and get rewarded.
- Learning & development: get access to a Premium Coursera subscription.
- Language classes: we provide free English, Spanish, or Portuguese classes.
- Social budget: you'll get a monthly budget to chill out with your team (in person or remotely) and deepen your connections!
- dLocal Houses: want to rent a house to spend one week anywhere in the world coworking with your team? We've got your back!

For people based in Montevideo (Uruguay) applying to non-IT roles, 55% monthly attendance at the office is required.

What happens after you apply? Our Talent Acquisition team is invested in creating the best candidate experience possible, so don't worry, you will definitely hear from us. We will review your CV and keep you posted by email at every step of the process! Also, you can check out our webpage, LinkedIn, Instagram, and YouTube for more about dLocal!
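As a purely illustrative companion to the batch-processing and AWS requirements above, here is a minimal PySpark sketch of a daily ETL step that reads raw JSON events from S3 and writes partitioned Parquet for downstream querying (for example via Athena or Redshift Spectrum). The bucket names, paths, and column names are hypothetical placeholders, and the cluster is assumed to have S3 access configured (e.g. EMR or Glue).

```python
# Minimal batch ETL sketch (illustrative, not from the posting): read raw JSON
# events from S3, apply a light transform, and write partitioned Parquet.
# Bucket names, paths, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-events-etl").getOrCreate()

raw = spark.read.json("s3://example-raw-bucket/events/2024-01-01/")  # placeholder path

cleaned = (
    raw
    .withColumn("event_date", F.to_date("event_timestamp"))  # assumes this column exists
    .filter(F.col("event_type").isNotNull())
    .dropDuplicates(["event_id"])                             # assumes an event_id column
)

(
    cleaned.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-curated-bucket/events/")           # placeholder path
)

spark.stop()
```

Partitioning by date and writing columnar Parquet is the usual way to keep Athena/Redshift Spectrum scans cheap; the same job shape works on EMR, Glue, or a local Spark install.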
6 hours ago
Software Engineer, Backend
StackAdapt
Canada, NC
StackAdapt is the leading technology company that empowers marketers to reach, engage, and convert audiences with precision. With 465 billion automated optimizations per second, the AI-powered StackAdapt Marketing Platform seamlessly connects brand and performance marketing to drive measurable results across the entire customer journey. The most forward-thinking marketers choose StackAdapt to orchestrate high-impact campaigns across programmatic advertising and marketing channels. StackAdapt is the no. 1 performing programmatic advertising platform, helping brands accelerate customer engagement and acquisition. This state-of-the-art platform is where some of the most progressive work in machine learning meets cutting-edge user experience. Ranked highest in performance by G2 Crowd for the fourth time, we're one of the fastest-growing companies in Canada, ranking 6th in Deloitte's Technology Fast 50 and 23rd in the Fast 500 in North America. Our real-time advertising bidding system handles over 3,000,000 requests per second and stores several terabytes of data every day. Our technologies include Go, Ruby on Rails, Aerospike, Redis, Elasticsearch, Kafka, RocksDB, Redshift, ScyllaDB, GraphQL, and others. We're not afraid to test and try new technologies. Watch our talk at Amazon Tech Talks: https://www.youtube.com/watch?v=lRqu-a4gPuU StackAdapt is a Remote First company; we are open to candidates located anywhere in Canada or the United States for this position.

What You'll Be Doing
- Building highly scalable, distributed, real-time micro-services, primarily written in Go
- Working with large data sets and various databases, including Aerospike, Elasticsearch, Redis, ScyllaDB, Redshift, TiDB, and MariaDB
- Building software that utilizes messaging queues such as Kafka, SQS, and Kinesis (a brief consumer sketch follows this listing's description)
- Writing performance-efficient and memory-optimized code

We'll Be Reaching Out To Candidates That Have
- 2+ years of experience as a Backend Software Engineer
- Very strong problem-solving skills in data structures, algorithms, and optimization
- Experience working with relational databases and/or key-value stores
- Experience building scalable micro-services and distributed systems

StackAdapters Enjoy
- Highly competitive salary + commission structure
- RRSP/401K matching
- 3 weeks vacation + 3 personal care days + 1 Culture & Belief day + birthdays off
- Access to a comprehensive mental health care platform
- Health benefits from day one of employment
- Work-from-home reimbursements
- Optional global WeWork membership for those who want a change from their home office
- Robust training and onboarding program
- Coverage and support of personal development initiatives (conferences, courses, etc.)
- Access to StackAdapt programmatic courses and certifications to support continuous learning
- An awesome parental leave policy
- A friendly, welcoming, and supportive culture
- Our social and team events!

StackAdapt is a diverse and inclusive team of collaborative, hardworking individuals trying to make a dent in the universe. No matter who you are, where you are from, who you love, follow in faith, disability (or superpower) status, ethnicity, or the gender you identify with (if you're comfortable, let us know your pronouns), you are welcome at StackAdapt. If you have any requests or requirements to support you throughout any part of the interview process, please let our Talent team know.

About StackAdapt: We've been recognized for our diverse and supportive workplace, high-performing campaigns, award-winning customer service, and innovation.
We've been awarded:
- Ad Age Best Places to Work 2024
- G2 Top Software and Top Marketing and Advertising Product for 2024
- Campaign's Best Places to Work 2023 for the UK
- 2024 Best Workplaces for Women and in Canada by Great Place to Work
- #1 DSP on G2 and leader in a number of categories, including Cross-Channel Advertising
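The role itself is Go-centric, so purely as a language-agnostic illustration of consuming from one of the named queues, here is a minimal Python sketch of an SQS long-poll consumer. The queue URL is a hypothetical placeholder, and boto3 credentials are assumed to be configured in the environment.

```python
# Minimal sketch (illustrative only): long-polling an SQS queue and deleting
# messages after successful processing. The queue URL is a placeholder.
import json
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/example-events"  # placeholder

def process(body: dict) -> None:
    # Stand-in for real work, e.g. updating a key-value store or emitting metrics.
    print("processing event:", body.get("type"))

def consume_forever() -> None:
    while True:
        resp = sqs.receive_message(
            QueueUrl=QUEUE_URL,
            MaxNumberOfMessages=10,
            WaitTimeSeconds=20,  # long polling reduces empty receives
        )
        for msg in resp.get("Messages", []):
            process(json.loads(msg["Body"]))
            # Delete only after the message has been handled successfully.
            sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])

if __name__ == "__main__":
    consume_forever()
```

Deleting only after successful processing gives at-least-once semantics; the equivalent Go consumer follows the same receive/process/delete loop with the AWS SDK for Go.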
12 hours ago
Senior Reporting Engineer
dLocal
Remote
Why should you join dLocal? dLocal enables the biggest companies in the world to collect payments in 40 countries in emerging markets. Global brands rely on us to increase conversion rates and simplify payment expansion effortlessly. As both a payments processor and a merchant of record where we operate, we make it possible for our merchants to make inroads into the world's fastest-growing emerging markets. By joining us you will be a part of an amazing global team that makes it all happen, in a flexible, remote-first, dynamic culture with travel, health, and learning benefits, among others. Being a part of dLocal means working with 1000+ teammates from 30+ different nationalities and developing an international career that impacts millions of people's daily lives. We are builders, we never run from a challenge, we are customer-centric, and if this sounds like you, we know you will thrive in our team. We are looking for a Software Engineer who wants to build high-performance, scalable, enterprise-grade applications. You'll be part of a talented software team working on apps to deliver insights to big clients like Netflix, Amazon, Nike, Facebook, and more!

### What will I be doing?
- Contributing to all phases of the analytical application development life cycle
- Designing, developing, and delivering high-volume applications for data analytics systems
- Writing well-designed, testable, and efficient code
- Ensuring designs are in compliance with specifications
- Supporting continuous improvement by investigating alternatives and technologies and presenting these for architectural review

### What skills do I need?
- Great knowledge of Python
- Great knowledge of SQL & DBMS, Apache Iceberg
- Great knowledge of distributed processing (Apache Spark, Hadoop, Hive, Presto, or similar)
- Deep understanding of data modelling (star schema, snowflake) and manipulation/cleansing
- Good knowledge of non-relational databases (NoSQL) and semi-structured/unstructured data
- Experience with the AWS environment (S3, Redshift, RDS, SQS, Athena, Glue, CloudWatch, EMR, Lambda, or similar)
- Experience with code versioning (GitHub or similar)
- Experience in batch processing (ETL/ELT)
- Advanced/fluent English

### What do we offer?
Besides the tailored benefits we have for each country, dLocal will help you thrive and go that extra mile by offering you:
- Remote work: work from anywhere or one of our offices around the globe!
- Flexibility: we have flexible schedules and we are driven by performance.
- Fintech industry: work in a dynamic and ever-evolving environment with plenty to build and boost your creativity.
- Referral bonus program: our internal talents are the best recruiters - refer someone ideal for a role and get rewarded.
- Learning & development: get access to a Premium Coursera subscription.
- Language classes: we provide free English, Spanish, or Portuguese classes.
- Social budget: you'll get a monthly budget to chill out with your team (in person or remotely) and deepen your connections!
- dLocal Houses: want to rent a house to spend one week anywhere in the world coworking with your team? We've got your back!

For people based in Montevideo (Uruguay) applying to non-IT roles, 55% monthly attendance at the office is required.

What happens after you apply? Our Talent Acquisition team is invested in creating the best candidate experience possible, so don't worry, you will definitely hear from us. We will review your CV and keep you posted by email at every step of the process! Also, you can check out our webpage, LinkedIn, Instagram, and YouTube for more about dLocal!
23 hours ago
SPEC DATABASE ADMINISTRATOR
Solventum
Gaithersburg, MD
Thank you for your interest in joining Solventum. Solventum is a new healthcare company with a long legacy of solving big challenges that improve lives and help healthcare professionals perform at their best. At Solventum, people are at the heart of every innovation we pursue. Guided by empathy, insight, and clinical intelligence, we collaborate with the best minds in healthcare to address our customers' toughest challenges. While we continue updating the Solventum Careers Page and applicant materials, some documents may still reflect legacy branding. Please note that all listed roles are Solventum positions, and our Privacy Policy (https://www.solventum.com/en-us/home/legal/website-privacy-statement/applicant-privacy/) applies to any personal information you submit. As it was with 3M, at Solventum all qualified applicants will receive consideration for employment without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Job Description: At Solventum, we enable better, smarter, safer healthcare to improve lives. As a new company with a long legacy of creating breakthrough solutions for our customers' toughest challenges, we pioneer game-changing innovations at the intersection of health, material, and data science that change patients' lives for the better while enabling healthcare professionals to perform at their best. Because people and their wellbeing are at the heart of every scientific advancement we pursue, we partner closely with the brightest minds in healthcare to ensure that every solution we create melds the latest technology with compassion and empathy. Because at Solventum, we never stop solving for you.

The Impact You'll Make in this Role: Solventum is seeking a highly motivated Senior Cloud Data Engineer to help shape the future of global health informatics. In this role you'll collaborate with cross-functional teams to design and support our Enterprise Data Mesh, enabling data-driven healthcare innovations on a global scale. You will work closely with SaaS application teams to build robust and scalable data solutions used by a wide network of healthcare providers.

Key Responsibilities
- Design and build scalable, efficient, and fault-tolerant data operations with structured and unstructured data systems.
- Develop, build, and maintain AWS data platform services such as S3, RDS, Aurora, Redshift, PostgreSQL, and DynamoDB using a cloud deployment pipeline.
- Develop and maintain robust data pipelines for ingesting, transforming, and distributing data streams.
- Partner with senior technical leads, analysts, engineers, and scientists to implement scalable data initiatives using AWS cloud services.
- Implement advanced data architectures for accelerated solution design, including data integration, modeling, governance, and applications.
- Champion best practices in data engineering standards.
- Mentor and guide internal teams on database standards, pipeline development, and efficient data consumption.
- Implement and maintain database security measures, including user access controls and data encryption (see the sketch at the end of this listing).
- Manage healthcare data and maintain HIPAA, SOC 2, FedRAMP, and StateRAMP data security controls.

Qualifications
- Bachelor's degree (or higher) in Computer Science or a related field from an accredited institution.
- Minimum 8 years of solid experience deploying and managing cloud data platforms on AWS.
- Advanced proficiency in SQL.
- Deep experience with AWS S3, AWS RDS, Aurora, PostgreSQL, and MSSQL.

Additional qualifications that could help you succeed even further in this role include:
- Proficiency with both SQL and NoSQL databases (e.g. PostgreSQL, Redshift, DynamoDB).
- Experience in cloud-native architectures and distributed computing with AWS tools like Glue and EMR.
- Expertise in Python, PySpark, Scala, and advanced SQL techniques.
- Familiarity with GraphQL.
- Experience with Docker, Kubernetes, and microservice-based Data API development.
- Experience with stream processing tools like Amazon Kinesis, Apache Spark, Storm, or Kafka.
- Agile development experience and proficiency with tools like JIRA and Confluence.
- Strong collaboration and communication skills.
- AWS professional certificate(s).

Work location: Remote. Travel: may include up to 15% domestic/international. Relocation assistance: not authorized. Must be legally authorized to work in the country of employment without sponsorship for employment visa status (e.g. H1B status).

Supporting Your Well-being. Onboarding Requirement: To improve the onboarding experience, you will have an opportunity to meet with your manager and other new employees as part of the Solventum new employee orientation. As a result, new employees hired for this position will be required to travel to a designated company location for on-site onboarding during their initial days of employment. Travel arrangements and related expenses will be coordinated and paid for by the company in accordance with its travel policy. Applies to new hires with a start date of October 1st, 2025 or later.

Applicable to US Applicants Only: The expected compensation range for this position is $137,439 - $167,981, which includes base pay plus variable incentive pay, if eligible. This range represents a good faith estimate for this position. The specific compensation offered to a candidate may vary based on factors including, but not limited to, the candidate's relevant knowledge, training, skills, work location, and/or experience. In addition, this position may be eligible for a range of benefits (e.g. Medical, Dental & Vision, Health Savings Accounts, Health Care & Dependent Care Flexible Spending Accounts, Disability Benefits, Life Insurance, Voluntary Benefits, Paid Absences, and Retirement Benefits). Additional information is available at: https://www.solventum.com/en-us/home/our-company/careers/#Total-Rewards

Responsibilities of this position include ensuring that corporate policies, procedures, and security standards are complied with while performing assigned duties. Solventum is committed to maintaining the highest standards of integrity and professionalism in our recruitment process. Applicants must remain alert to fraudulent job postings and recruitment schemes that falsely claim to represent Solventum and seek to exploit job seekers. Please note that all email communications from Solventum regarding job opportunities with the company will be from an email with a domain of @solventum.com. Be wary of unsolicited emails or messages regarding Solventum job opportunities from emails with other email domains. Please note Solventum does not expect candidates in this position to perform work in the unincorporated areas of Los Angeles County. Solventum is an equal opportunity employer.

Solventum will not discriminate against any applicant for employment on the basis of race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, or veteran status. Please note: your application may not be considered if you do not provide your education and work history, either by 1) uploading a resume or 2) entering the information into the application fields directly.

Solventum Global Terms of Use and Privacy Statement: Carefully read these Terms of Use before using this website. Your access to and use of this website and application for a job at Solventum are conditioned on your acceptance of and compliance with these terms. Please access the linked document, select the country where you are applying for employment, and review. Before submitting your application you will be asked to confirm your agreement with the terms.
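The posting names database security and user access controls among the responsibilities. As a rough illustration only, here is a minimal Python sketch (using psycopg2) that provisions a read-only PostgreSQL role with least-privilege grants. The connection parameters, role name, and schema are hypothetical placeholders, and in practice the passwords would come from a secrets manager rather than the script.

```python
# Minimal sketch (illustrative only): create a least-privilege, read-only role
# in PostgreSQL. Connection details, role name, and schema are placeholders;
# passwords should come from a secrets manager in real use.
import psycopg2

DDL = [
    "CREATE ROLE analyst_ro WITH LOGIN PASSWORD 'change-me'",
    "GRANT USAGE ON SCHEMA reporting TO analyst_ro",
    "GRANT SELECT ON ALL TABLES IN SCHEMA reporting TO analyst_ro",
    # Make sure tables created later in this schema are readable too.
    "ALTER DEFAULT PRIVILEGES IN SCHEMA reporting GRANT SELECT ON TABLES TO analyst_ro",
]

def main() -> None:
    conn = psycopg2.connect(
        host="example-db.cluster-abc.us-east-1.rds.amazonaws.com",  # placeholder
        dbname="clinical",                                           # placeholder
        user="admin",                                                # placeholder
        password="change-me",                                        # placeholder
    )
    conn.autocommit = True
    with conn.cursor() as cur:
        for stmt in DDL:
            cur.execute(stmt)
    conn.close()

if __name__ == "__main__":
    main()
```

Pairing a role like this with encrypted storage (RDS/Aurora encryption at rest) and TLS connections covers the "user access controls and data encryption" bullet at its most basic level.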
1 day ago
Senior Reporting Engineer
DLOCAL
Remote, Argentina
Why should you join dLocal? dLocal enables the biggest companies in the world to collect payments in 40 countries in emerging markets. Global brands rely on us to increase conversion rates and simplify payment expansion effortlessly. As both a payments processor and a merchant of record where we operate, we make it possible for our merchants to make inroads into the world's fastest-growing emerging markets. By joining us you will be a part of an amazing global team that makes it all happen, in a flexible, remote-first, dynamic culture with travel, health, and learning benefits, among others. Being a part of dLocal means working with 1000+ teammates from 30+ different nationalities and developing an international career that impacts millions of people's daily lives. We are builders, we never run from a challenge, we are customer-centric, and if this sounds like you, we know you will thrive in our team. We are looking for a Software Engineer who wants to build high-performance, scalable, enterprise-grade applications. You'll be part of a talented software team working on apps to deliver insights to big clients like Netflix, Amazon, Nike, Facebook, and more!

### What will I be doing?
- Contributing to all phases of the analytical application development life cycle
- Designing, developing, and delivering high-volume applications for data analytics systems
- Writing well-designed, testable, and efficient code
- Ensuring designs are in compliance with specifications
- Supporting continuous improvement by investigating alternatives and technologies and presenting these for architectural review

### What skills do I need?
- Great knowledge of Python
- Great knowledge of SQL & DBMS, Apache Iceberg
- Great knowledge of distributed processing (Apache Spark, Hadoop, Hive, Presto, or similar)
- Deep understanding of data modelling (star schema, snowflake) and manipulation/cleansing
- Good knowledge of non-relational databases (NoSQL) and semi-structured/unstructured data
- Experience with the AWS environment (S3, Redshift, RDS, SQS, Athena, Glue, CloudWatch, EMR, Lambda, or similar)
- Experience with code versioning (GitHub or similar)
- Experience in batch processing (ETL/ELT)
- Advanced/fluent English

### What do we offer?
Besides the tailored benefits we have for each country, dLocal will help you thrive and go that extra mile by offering you:
- Remote work: work from anywhere or one of our offices around the globe!
- Flexibility: we have flexible schedules and we are driven by performance.
- Fintech industry: work in a dynamic and ever-evolving environment with plenty to build and boost your creativity.
- Referral bonus program: our internal talents are the best recruiters - refer someone ideal for a role and get rewarded.
- Learning & development: get access to a Premium Coursera subscription.
- Language classes: we provide free English, Spanish, or Portuguese classes.
- Social budget: you'll get a monthly budget to chill out with your team (in person or remotely) and deepen your connections!
- dLocal Houses: want to rent a house to spend one week anywhere in the world coworking with your team? We've got your back!

For people based in Montevideo (Uruguay) applying to non-IT roles, 55% monthly attendance at the office is required.

What happens after you apply? Our Talent Acquisition team is invested in creating the best candidate experience possible, so don't worry, you will definitely hear from us. We will review your CV and keep you posted by email at every step of the process! Also, you can check out our webpage, LinkedIn, Instagram, and YouTube for more about dLocal!
1 day ago
Clinical Data Architect
WRS Health
Remote, Argentina
Company Overview: Voted #1 EHR by PC Mag, WRS Health delivers fully integrated, cloud-based EMR and practice management solutions to its clients. We bring solutions to physicians by providing constant enhancement of our products and services, including EHR, practice management, marketing, patient coordination, and billing.

Job Purpose and Role: WRS Health is seeking a hands-on and strategic Director of Data Architecture to lead our data platform initiatives on Amazon Web Services (AWS), with a strong emphasis on healthcare data, data lake strategy, and AI enablement. You will architect and manage scalable, secure, and high-performance data systems that support both traditional analytics and modern ML workloads, including embedding models and vectorized data retrieval.

Key Responsibilities:

Strategic Data Platform Leadership
- Define and implement an enterprise-wide data architecture strategy that supports interoperability, AI/ML readiness, and regulatory compliance.
- Lead the evolution of our AWS-based data lake architecture, supporting structured, semi-structured, and unstructured data types, especially FHIR-formatted JSON healthcare data.

Cloud Data Lake & Storage Optimization
- Design and maintain scalable, secure, and cost-effective data lakes using Amazon S3, AWS Glue, Athena, Redshift, and Lake Formation.
- Leverage Mountpoint for S3 to enable high-performance, POSIX-compliant access to S3 objects, including vectorized data files.
- Optimize data storage and retrieval strategies for performance and cost-efficiency, including partitioning, file formats (e.g. Parquet, ORC), and compression techniques.

AI/ML Enablement and Vector Infrastructure
- Collaborate with data science teams to implement embedding models, vectorization pipelines, and real-time inference architectures.
- Design and manage vector storage systems (e.g. S3-based FAISS, Pinecone, or Amazon OpenSearch) to support semantic search, retrieval-augmented generation (RAG), and intelligent data access (see the sketch at the end of this listing).
- Ensure vectorized data pipelines are aligned with model training, evaluation, and deployment strategies.

Healthcare Data Architecture & Interoperability
- Architect systems to ingest, process, and store FHIR-compliant JSON data from EHRs, APIs, and HL7 sources.
- Ensure conformance with healthcare interoperability standards and optimize for queryability and downstream analytics.
- Implement data normalization and enrichment pipelines for use in both clinical and operational contexts.

Security, Compliance & Governance
- Lead efforts to ensure data security at rest and in transit using AWS-native encryption, IAM, VPC controls, and bucket policies.
- Implement and manage data access controls, audit logging, and role-based security models across AWS environments.
- Oversee data governance, including lineage, cataloging, and stewardship, with tools such as AWS Glue Data Catalog, Lake Formation, or third-party platforms.

Team Leadership & Cross-Functional Collaboration
- Build and lead a high-performing team of data architects and engineers.
- Work closely with stakeholders from engineering, data science, product, and compliance teams to deliver data initiatives.
- Promote data literacy and foster a culture of innovation and continuous improvement.

Qualifications:
- Bachelor's or Master's in Computer Science, Data Engineering, or a related field.
- 8–12+ years of experience in data architecture, with 3–5 years in a technical leadership role.
- Proven experience architecting AWS-based data lakes and analytics pipelines.
- Deep understanding of healthcare data standards (FHIR, HL7) and working with FHIR JSON objects in large-scale systems.
- Expertise with embedding and vectorization models, semantic search, and managing vector storage solutions.
- Hands-on experience with Amazon S3, Mountpoint for S3, and optimizing S3-based workloads for performance and cost.
- Strong background in data security, encryption, access control, and compliance frameworks (HIPAA, HITRUST).

Preferred Qualifications
- AWS certifications (e.g. AWS Certified Big Data or Data Analytics – Specialty).
- Familiarity with open-source vector databases (e.g. FAISS, Weaviate) and MLOps pipelines.
- Experience in clinical systems integration, claims processing, or population health analytics.

This is an independent contractor position. Job Type: Full-time. Location: Remote. Hours: available during standard US business hours (9am-5pm EST or 8:30am-4:30pm EST). This job description is intended to describe the general requirements for the position. It is not a complete statement of duties, responsibilities, or requirements. Other duties not listed here may be assigned as necessary to ensure proper operations of the department.
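To make the vector-infrastructure bullet concrete, here is a minimal, purely illustrative Python sketch of building and querying a FAISS index over embedding vectors. The random vectors stand in for real clinical-text embeddings, the dimensions and counts are arbitrary, and persisting the index file to S3 (as the posting's "S3-based FAISS" suggests) would be a separate upload step.

```python
# Minimal sketch (illustrative only): build a FAISS index over embedding
# vectors and run a nearest-neighbour search. Random vectors stand in for
# real embeddings of clinical text; dimensions and counts are arbitrary.
import numpy as np
import faiss

dim = 384            # embedding dimensionality (placeholder)
n_docs = 10_000      # number of indexed documents (placeholder)

rng = np.random.default_rng(0)
doc_vectors = rng.random((n_docs, dim), dtype=np.float32)    # stand-in embeddings

index = faiss.IndexFlatL2(dim)   # exact L2 search; IVF/HNSW variants scale further
index.add(doc_vectors)

query = rng.random((1, dim), dtype=np.float32)               # stand-in query embedding
distances, ids = index.search(query, 5)                      # top-5 nearest documents
print("nearest doc ids:", ids[0], "distances:", distances[0])

# The index can be serialized locally and then uploaded to S3 as an object.
faiss.write_index(index, "/tmp/example.faiss")
```

In a RAG-style setup, the ids returned by the search map back to the source documents whose text is passed to the generation step; swapping IndexFlatL2 for an IVF or HNSW index is the usual next step once the corpus grows.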
1 day ago
Senior Reporting Engineer
dLocal
Brazil (Remote) / Montevideo (Remote) / Argentina (Remote)
Why should you join dLocal? dLocal enables the biggest companies in the world to collect payments in 40 countries in emerging markets. Global brands rely on us to increase conversion rates and simplify payment expansion effortlessly. As both a payments processor and a merchant of record where we operate, we make it possible for our merchants to make inroads into the world's fastest-growing emerging markets. By joining us you will be a part of an amazing global team that makes it all happen, in a flexible, remote-first, dynamic culture with travel, health, and learning benefits, among others. Being a part of dLocal means working with 1000+ teammates from 30+ different nationalities and developing an international career that impacts millions of people's daily lives. We are builders, we never run from a challenge, we are customer-centric, and if this sounds like you, we know you will thrive in our team. We are looking for a Software Engineer who wants to build high-performance, scalable, enterprise-grade applications. You'll be part of a talented software team working on apps to deliver insights to big clients like Netflix, Amazon, Nike, Facebook, and more!

### What will I be doing?
- Contributing to all phases of the analytical application development life cycle
- Designing, developing, and delivering high-volume applications for data analytics systems
- Writing well-designed, testable, and efficient code
- Ensuring designs are in compliance with specifications
- Supporting continuous improvement by investigating alternatives and technologies and presenting these for architectural review

### What skills do I need?
- Great knowledge of Python
- Great knowledge of SQL & DBMS, Apache Iceberg
- Great knowledge of distributed processing (Apache Spark, Hadoop, Hive, Presto, or similar)
- Deep understanding of data modelling (star schema, snowflake) and manipulation/cleansing
- Good knowledge of non-relational databases (NoSQL) and semi-structured/unstructured data
- Experience with the AWS environment (S3, Redshift, RDS, SQS, Athena, Glue, CloudWatch, EMR, Lambda, or similar)
- Experience with code versioning (GitHub or similar)
- Experience in batch processing (ETL/ELT)
- Advanced/fluent English

### What do we offer?
Besides the tailored benefits we have for each country, dLocal will help you thrive and go that extra mile by offering you:
- Remote work: work from anywhere or one of our offices around the globe!
- Flexibility: we have flexible schedules and we are driven by performance.
- Fintech industry: work in a dynamic and ever-evolving environment with plenty to build and boost your creativity.
- Referral bonus program: our internal talents are the best recruiters - refer someone ideal for a role and get rewarded.
- Learning & development: get access to a Premium Coursera subscription.
- Language classes: we provide free English, Spanish, or Portuguese classes.
- Social budget: you'll get a monthly budget to chill out with your team (in person or remotely) and deepen your connections!
- dLocal Houses: want to rent a house to spend one week anywhere in the world coworking with your team? We've got your back!

For people based in Montevideo (Uruguay) applying to non-IT roles, 55% monthly attendance at the office is required.

What happens after you apply? Our Talent Acquisition team is invested in creating the best candidate experience possible, so don't worry, you will definitely hear from us. We will review your CV and keep you posted by email at every step of the process! Also, you can check out our webpage, LinkedIn, Instagram, and YouTube for more about dLocal!
1 day ago
Senior Reporting Engineer
dLocal
Remote, Brazil
Why should you join dLocal? dLocal enables the biggest companies in the world to collect payments in 40 countries in emerging markets. Global brands rely on us to increase conversion rates and simplify payment expansion effortlessly. As both a payments processor and a merchant of record where we operate, we make it possible for our merchants to make inroads into the world's fastest-growing emerging markets. By joining us you will be a part of an amazing global team that makes it all happen, in a flexible, remote-first, dynamic culture with travel, health, and learning benefits, among others. Being a part of dLocal means working with 1000+ teammates from 30+ different nationalities and developing an international career that impacts millions of people's daily lives. We are builders, we never run from a challenge, we are customer-centric, and if this sounds like you, we know you will thrive in our team. We are looking for a Software Engineer who wants to build high-performance, scalable, enterprise-grade applications. You'll be part of a talented software team working on apps to deliver insights to big clients like Netflix, Amazon, Nike, Facebook, and more!

### What will I be doing?
- Contributing to all phases of the analytical application development life cycle
- Designing, developing, and delivering high-volume applications for data analytics systems
- Writing well-designed, testable, and efficient code
- Ensuring designs are in compliance with specifications
- Supporting continuous improvement by investigating alternatives and technologies and presenting these for architectural review

### What skills do I need?
- Great knowledge of Python, and Java or Golang
- Strong experience in developing REST API applications (see the sketch after this listing)
- Great knowledge of SQL & DBMS
- Deep understanding of data modelling (star schema, snowflake) and manipulation/cleansing
- Good knowledge of non-relational databases (NoSQL) and semi-structured/unstructured data
- Good knowledge of distributed processing (Apache Spark, Hadoop, Hive, Presto, or similar)
- Experience with the AWS environment (S3, Redshift, RDS, SQS, Athena, Glue, CloudWatch, EMR, Lambda, or similar)
- Experience with code versioning (GitHub or similar)
- Experience in batch processing (ETL/ELT)
- Advanced/fluent English

### Desirable skills
- Experience with Agile/Kanban work methodology (JIRA)
- Experience using Unix OS
- Experience with dataviz tools (Tableau, Looker, Data Studio, Power BI, or similar)
- Understanding of file formats and how to manipulate them (AVRO, JSON, PARQUET, CSV, etc.)
- Knowledge of GCP
- Knowledge of streaming
- Knowledge of orchestration tools (Apache Airflow, Prefect, Mage, or similar)

### What do we offer?
Besides the tailored benefits we have for each country, dLocal will help you thrive and go that extra mile by offering you:
- Remote work: work from anywhere or one of our offices around the globe!
- Flexibility: we have flexible schedules and we are driven by performance.
- Fintech industry: work in a dynamic and ever-evolving environment with plenty to build and boost your creativity.
- Referral bonus program: our internal talents are the best recruiters - refer someone ideal for a role and get rewarded.
- Learning & development: get access to a Premium Coursera subscription.
- Language classes: we provide free English, Spanish, or Portuguese classes.
- Social budget: you'll get a monthly budget to chill out with your team (in person or remotely) and deepen your connections!
- dLocal Houses: want to rent a house to spend one week anywhere in the world coworking with your team? We've got your back!

For people based in Montevideo (Uruguay) applying to non-IT roles, 55% monthly attendance at the office is required.

What happens after you apply? Our Talent Acquisition team is invested in creating the best candidate experience possible, so don't worry, you will definitely hear from us. We will review your CV and keep you posted by email at every step of the process! Also, you can check out our webpage, LinkedIn, Instagram, and YouTube for more about dLocal!
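Since this variant of the role asks for REST API development experience in Python (or Java/Golang), here is a minimal, illustrative Python sketch using FastAPI for a reporting-style endpoint. The route, response model, and in-memory data are hypothetical placeholders standing in for a real query against the analytics stack.

```python
# Minimal REST API sketch (illustrative only): a FastAPI endpoint returning a
# small payments-report summary. The route, model, and data are placeholders;
# a real implementation would query the warehouse instead of a dict.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="example-reporting-api")

class CountryReport(BaseModel):
    country: str
    transactions: int
    approval_rate: float

# Stand-in data; in practice this would come from Redshift/Athena.
FAKE_REPORTS = {
    "BR": CountryReport(country="BR", transactions=1250, approval_rate=0.87),
    "UY": CountryReport(country="UY", transactions=310, approval_rate=0.91),
}

@app.get("/reports/{country}", response_model=CountryReport)
def get_country_report(country: str) -> CountryReport:
    report = FAKE_REPORTS.get(country.upper())
    if report is None:
        raise HTTPException(status_code=404, detail="no report for that country")
    return report

# Run locally with: uvicorn example_api:app --reload  (assuming this file is example_api.py)
```

The response_model gives schema validation and OpenAPI docs for free, which is the usual reason to reach for FastAPI (or an equivalent framework in Java/Go) for internal reporting endpoints.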
1 day ago
Lead Data Architect
WRS Health
Remote, Indonesia
Company Overview: Voted #1 EHR by PC Mag, WRS Health delivers fully integrated, cloud-based EMR and practice management solutions to its clients. We bring solutions to physicians by providing constant enhancement of our products and services, including EHR, practice management, marketing, patient coordination, and billing.

Job Purpose and Role: WRS Health is seeking a hands-on and strategic Director of Data Architecture to lead our data platform initiatives on Amazon Web Services (AWS), with a strong emphasis on healthcare data, data lake strategy, and AI enablement. You will architect and manage scalable, secure, and high-performance data systems that support both traditional analytics and modern ML workloads, including embedding models and vectorized data retrieval.

Key Responsibilities:

Strategic Data Platform Leadership
- Define and implement an enterprise-wide data architecture strategy that supports interoperability, AI/ML readiness, and regulatory compliance.
- Lead the evolution of our AWS-based data lake architecture, supporting structured, semi-structured, and unstructured data types, especially FHIR-formatted JSON healthcare data.

Cloud Data Lake & Storage Optimization
- Design and maintain scalable, secure, and cost-effective data lakes using Amazon S3, AWS Glue, Athena, Redshift, and Lake Formation.
- Leverage Mountpoint for S3 to enable high-performance, POSIX-compliant access to S3 objects, including vectorized data files.
- Optimize data storage and retrieval strategies for performance and cost-efficiency, including partitioning, file formats (e.g. Parquet, ORC), and compression techniques.

AI/ML Enablement and Vector Infrastructure
- Collaborate with data science teams to implement embedding models, vectorization pipelines, and real-time inference architectures.
- Design and manage vector storage systems (e.g. S3-based FAISS, Pinecone, or Amazon OpenSearch) to support semantic search, retrieval-augmented generation (RAG), and intelligent data access.
- Ensure vectorized data pipelines are aligned with model training, evaluation, and deployment strategies.

Healthcare Data Architecture & Interoperability
- Architect systems to ingest, process, and store FHIR-compliant JSON data from EHRs, APIs, and HL7 sources.
- Ensure conformance with healthcare interoperability standards and optimize for queryability and downstream analytics.
- Implement data normalization and enrichment pipelines for use in both clinical and operational contexts.

Security, Compliance & Governance
- Lead efforts to ensure data security at rest and in transit using AWS-native encryption, IAM, VPC controls, and bucket policies.
- Implement and manage data access controls, audit logging, and role-based security models across AWS environments.
- Oversee data governance, including lineage, cataloging, and stewardship, with tools such as AWS Glue Data Catalog, Lake Formation, or third-party platforms.

Team Leadership & Cross-Functional Collaboration
- Build and lead a high-performing team of data architects and engineers.
- Work closely with stakeholders from engineering, data science, product, and compliance teams to deliver data initiatives.
- Promote data literacy and foster a culture of innovation and continuous improvement.

Qualifications:
- Bachelor's or Master's in Computer Science, Data Engineering, or a related field.
- 8–12+ years of experience in data architecture, with 3–5 years in a technical leadership role.
- Proven experience architecting AWS-based data lakes and analytics pipelines.
- Deep understanding of healthcare data standards (FHIR, HL7) and working with FHIR JSON objects in large-scale systems.
- Expertise with embedding and vectorization models, semantic search, and managing vector storage solutions.
- Hands-on experience with Amazon S3, Mountpoint for S3, and optimizing S3-based workloads for performance and cost.
- Strong background in data security, encryption, access control, and compliance frameworks (HIPAA, HITRUST).

Preferred Qualifications
- AWS certifications (e.g. AWS Certified Big Data or Data Analytics – Specialty).
- Familiarity with open-source vector databases (e.g. FAISS, Weaviate) and MLOps pipelines.
- Experience in clinical systems integration, claims processing, or population health analytics.

This is an independent contractor position. Job Type: Full-time. Location: Remote. Hours: available during standard US business hours (9am-5pm EST or 8:30am-4:30pm EST). This job description is intended to describe the general requirements for the position. It is not a complete statement of duties, responsibilities, or requirements. Other duties not listed here may be assigned as necessary to ensure proper operations of the department.
1 day ago
