Job Title | Location | Description | Last Seen & URL |
---|---|---|---|
Data Analyst (m/w/d) - Remote möglich
Interlead GmbH |
Remote Germany
|
Hi, we are Interlead! Are you data-driven, and do you enjoy turning complex numbers into clear recommendations for action? Then we are looking for you as a Data Analyst (m/f/d) [in our Bremen office / 100% remote]!

Responsibilities:
- Build meaningful, user-friendly dashboards that help our departments make data-based decisions
- Develop and maintain lightweight ETL pipelines that keep our data processing efficient
- Run ad-hoc analyses to support data-driven decision-making across all business areas
- Create forecasts that enable targeted day-to-day steering and contribute to optimization

Qualifications:
- At least 2 years of relevant professional experience as a Data Analyst
- Solid knowledge of SQL and Python
- A structured way of working and the ability to manage requirements from different stakeholders effectively
- Experience with data visualization tools and ETL processes

Benefits:
- A dynamic marketing and tech environment and the opportunity to build "The Lead Company" together
- A huge market and many internal growth opportunities, since we solve our partners' biggest problem: we automate new-customer acquisition
- Full flexibility in where you work (depending on the position, work remotely or from our headquarters in Bremen's Überseestadt; working from abroad is possible by arrangement)
- Full flexibility in working hours (which you arrange individually with your team)
- 30 days of vacation per year, plus the option to take unpaid leave at any time
- We subsidize your company pension plan (the "Interlead-Rente")
- Great events every year (summer party & Christmas party)
- We subsidize your gym membership (via Urban Sports Club or EGYM Wellpass)
- Fast decisions, since we are 100% bootstrapped, profitable, and therefore not dependent on investors
- An open feedback culture and internal leadership training
- A monthly all-hands meeting, a weekly newsletter, and of course your regular team meetings, so you always stay well informed
- For all office employees, our headquarters in Bremen's Überseestadt offers everything you could wish for (ergonomic workstations, great coffee and espresso, drinks, barbecue evenings, etc.)
- For all home-office employees, we contribute a home-office allowance of EUR 20 per month
- A referral bonus of EUR 1,000 gross for every new team member you recommend (after they pass their probation period)
- An attractive onboarding package, as well as comprehensive onboarding across the entire company
- An internal academy for training and as a knowledge base, which you can access and use anytime from anywhere
- Dogs are always welcome! We look forward to your application!
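The "lightweight ETL pipelines" duty above, combined with the SQL and Python requirement, could look something like this minimal sketch. All table and column names (`raw_events`, `daily_kpis`) are invented for illustration and are not from the posting:

```python
import sqlite3

def run_pipeline(conn: sqlite3.Connection) -> int:
    """Tiny extract-transform-load pass; returns the number of days loaded."""
    # Extract: pull raw event rows from the source table
    rows = conn.execute("SELECT day, revenue FROM raw_events").fetchall()

    # Transform: aggregate revenue per day in plain Python
    totals: dict[str, float] = {}
    for day, revenue in rows:
        totals[day] = totals.get(day, 0.0) + revenue

    # Load: upsert the aggregates into a reporting table for dashboards
    conn.execute(
        "CREATE TABLE IF NOT EXISTS daily_kpis (day TEXT PRIMARY KEY, revenue REAL)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO daily_kpis VALUES (?, ?)", sorted(totals.items())
    )
    conn.commit()
    return len(totals)
```

In practice a pipeline like this would read from the company's actual sources and be scheduled, but the extract/transform/load split stays the same.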
|
2025-06-13 04:01
|
Data Analyst (100% Remote)
Lensa |
|
Lensa is the leading career site for job seekers at every stage of their career. Our client, ClearCaptions, is seeking professionals. Apply via Lensa today!

Who We Are: Since our founding in 2011, our mission has been to improve the lives of seniors and their caregivers. We are deeply passionate about communication and committed to becoming the foremost provider of services and solutions that enable seniors to lead more meaningful and independent lives. We also understand the power of connection and the profound impact it has on the lives of individuals who are hard of hearing. By utilizing enhanced automatic speech recognition, human captioning, and innovative product development, we deliver easy-to-use, cutting-edge technology to our primarily senior customer base. Our near real-time phone captioning technology allows individuals with hearing loss to see what callers are saying, enabling them to regain their connection to the world. ClearCaptions is a Federal Communications Commission (FCC)-certified telephone captioning provider adhering to the highest industry standards of privacy, security, and professionalism. We recognize the importance of maintaining the trust and confidence of our customers, and we continually strive to exceed their expectations. For more information about our services, please visit clearcaptions.com.

Position Summary: Are you passionate about data, excited by opportunities for deep-dive analysis, and driven by a curiosity to understand cross-functional business analytics? Are you looking for an exciting opportunity with a mission-driven organization dedicated to improving lives? ClearCaptions is seeking a talented, results-driven Data Analyst to join our dynamic, insights-focused team. In this role you will collaborate with key organizational leaders, including the C-suite, to deliver cohesive, rigorous, and insightful reporting tools and in-depth analyses that facilitate data-driven decision making.
The ideal candidate will have strong analytical skills, the ability to translate data into meaningful and actionable business recommendations that connect to higher-level business objectives, and a desire to partner with stakeholders to cultivate a culture of data literacy and democratization. Exceptional candidates will be energized by the opportunity to serve as a thought partner, problem solver, and bridge between technical and non-technical teams. This is a Remote/Work from Home position reporting to the Director of Business Insights.

What You Will Do:
- Work collaboratively with internal teams to solve cross-functional business challenges and provide key insights that drive improvement for both our business and our customers.
- Translate analyses into actionable, data-backed recommendations and a summary of expected results.
- Analyze operational initiatives to measure their effectiveness and impact on key performance indicators (KPIs) for both the functional department and the overall business.
- Enable visibility and in-depth understanding of existing performance; recommend optimizations to processes based on a data-driven orientation and analysis of expected results.
- Develop, refine, and leverage cross-functional dashboards and reports to provide timely data for decision-making, as well as regular updates on performance to stakeholders.
- Collect, integrate, and interpret cross-functional and disparate data sets for holistic insights and recommendations, as well as impact on business value.
- Evaluate the performance of processes to increase customer acquisition, retention, and adoption while ensuring we maintain excellence in customer service, integrity, and efficiency of resources.
- Adjust and optimize metrics and reporting to develop deeper organizational understanding of year-over-year pacing and progress against goals.
- Leverage advanced analytics modeling in regression, forecasting, segmentation, or similar for significant and rigorous business insights.
- Be a key partner with our internal team members to refine and optimize performance.
- Support A/B test design and execution; lead analysis of test data and delivery of actionable results.
- Maintain a detailed understanding of platforms, data structures, and services to become a reliable source for the organization and contribute to the evolution of analytics capabilities and frameworks.
- Partner closely with Data Engineering and Data Governance to communicate and collaborate on complex data needs and projects to support business requirements, automation initiatives, and opportunities.
- Generate regular and ad hoc reports for leadership and key stakeholders to support the business.
- Work to continually optimize internal processes to achieve operational excellence.
- Maintain data dictionaries, data catalogs, event mapping, and documentation of data sources.
- Ensure data security and compliance with data privacy regulations.

The Kind of People We Look For:
- Versatile people who thrive on variety and challenge
- Excited about working in a fast-paced environment
- Innate problem solvers who want to grow in a flexible, collaborative culture
- Take initiative, push boundaries, and are motivated to innovate
- Talented individuals with a growth mindset who want to use their learning and relationship-building skills
- Align with our company core values: Integrity, Accountability, Collaboration, Service, and Quality

Qualifications:
- Bachelor's degree in a related field (e.g., Business, Economics, Statistics, Analytics) or equivalent experience.
- 3-5 years in a role with a demonstrated analytical background and expertise (e.g., data analysis, business analytics, statistical analysis, forecasting/demand management).
- Proficiency in data analysis tools and languages such as SQL, DAX, Python, and/or R, with advanced skill in Excel.
- Strong data visualization skills using Power BI and/or Tableau.
- Excellent analytical skills, with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Solid understanding of statistical concepts and data analysis techniques, including evaluation of ROI and attribution.
- Demonstrated proficiency across a range of technologies related to data visualization, operational platforms, Salesforce data, and analysis of large and complex data sets.
- Experience analyzing and reporting data to identify issues, trends, or opportunities in a meaningful way to a variety of audiences.
- Experience collaborating with business unit leaders for requirements gathering and refinement of key business needs and questions.
- Excellent verbal and written communication, presentation, and problem-solving skills.
- Self-starter with strong organizational and time management skills; self-directed and able to handle multiple priorities with demanding timeframes.
- Ability to work collaboratively with colleagues and staff to create a high-quality, results-driven team environment built on trust and respect.
- Demonstrated ability as a self-starter: able to work independently, make sound decisions, and use discretion.
- Willingness and ability to occasionally work flexible hours as needed.
- Proficient in MS Office and modern communication tools for virtual teams (e.g., MS Teams, SharePoint, Jira, Slack).

Physical Demands: Employees may experience the following physical demands for extended periods of time: sitting, standing, and walking (95-100%); keyboarding (40-60%); viewing a computer monitor, tablet, or cell phone screen requiring close vision (95-100%).

Work Environment: 100% remote; the work environment is at home.

Compensation: $81,000 – $102,000/yr prospectively, with consideration to experience and geographical location. Please see www.clearcaptions.com/careers for an overview of our generous benefits program. Intrigued to learn more?
When you apply for this role, your information will be personally reviewed by our talent acquisition team (not by a robot). You can expect to hear back from us if we think there could be a fit, along with what next steps look like. ClearCaptions is an equal opportunity employer committed to inclusion and diversity. All employment decisions are based on business needs, job requirements, and individual qualifications, without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, Veteran status, or other legally protected characteristics. Disclaimer: The above information has been designed to indicate the general nature and level of work performed by employees within this classification. It is not designed to contain, or be interpreted as, a comprehensive inventory of all duties, responsibilities, and qualifications required of employees to do this job. CC does not offer sponsorship for work authorization. Candidates must be authorized to work for any employer in the US without a current or future need for visa sponsorship. Job Details: Pay Type: Salary
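The A/B-test analysis work this role describes often boils down to comparing conversion rates between a control and a variant. Below is a minimal sketch of a standard two-proportion z-test in plain Python; the method is textbook statistics, and nothing here reflects ClearCaptions' actual tooling or data:

```python
from math import sqrt, erf

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> tuple[float, float]:
    """Return (z statistic, two-sided p-value) comparing conversion rates A vs. B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled proportion under the null hypothesis that the rates are equal
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via the error function)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value
```

For example, 100/1000 conversions in control versus 130/1000 in the variant yields a z statistic near 2.1, significant at the conventional 0.05 level; real analyses would also check test power and segment effects.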
|
2025-06-13 03:20
|
Data Analyst (Remote)
Netvagas |
Remote Brazil
|
Description: To play in this band, it's important that you master:
- Good business sense: a consultant's profile who knows how to ask the right questions, break down the problem, define hypotheses, and decide which indicators answer those hypotheses
- Experience with statistical analysis on databases, using the most appropriate methods and languages
- Knowledge of BI tools (Power BI)
- Knowledge of SQL
- Knowledge of data modeling
- Knowledge of Git

To stand out, it helps to have:
- Knowledge of Tableau
- Knowledge of the data visualization tool Looker
- Knowledge of R or Python
- Experience with Trade Marketing
- Courses in Big Data / Data Analytics
- Advanced Spanish; advanced English

What will your day-to-day look like?
- Build the briefing together with internal teams, detailing the needs of the data analysis
- Collect, compile, and analyze data coming from an image recognition tool
- Develop dashboards using the Power BI platform
- Align architectures with business requirements
- Identify ways to improve data reliability, efficiency, and quality
- Stay up to date on, and keep deepening your understanding of, Involves' products
- Drive improvements in the area and its processes, aiming for the team's best performance.
|
2025-06-13 02:35
|
Data Security Architect (Data Lakehouse) - remote 1 year to start
Global IT Resources |
Remote United States
|
W2 CANDIDATES ONLY. NO C2C CANDIDATES WILL BE CONSIDERED AT THIS TIME!

Position Title: Data Security Architect (Data Lakehouse)

Job Summary: We are seeking a highly skilled Data Security Architect with extensive experience in securing Data Lakehouse environments. The ideal candidate will have a deep understanding of data security principles and best practices, and the ability to implement robust security frameworks. Experience with Snowflake and other cloud-based data platforms is highly preferred.

Key Responsibilities:
- Design & Implement Security Frameworks: Develop and implement comprehensive security architectures for Data Lakehouse environments, ensuring data integrity, confidentiality, and availability.
- Snowflake Security Expertise: Leverage expertise in Snowflake to design and enforce security policies, access controls, and data protection mechanisms within the platform.
- Data Governance: Collaborate with data governance teams to ensure that data management practices comply with regulatory requirements and industry standards.
- Threat Modeling & Risk Assessment: Conduct threat modeling, risk assessments, and security reviews to identify vulnerabilities and implement appropriate countermeasures.
- Access Management: Design and implement role-based access controls (RBAC), ensuring that access to sensitive data is restricted based on the principle of least privilege.
- Encryption & Data Masking: Implement and manage encryption standards and data masking techniques to protect sensitive information in transit and at rest.
- Secure PII Data: Design and engineer security safeguards for PII data.
- Security Policies: Define, create, implement, and maintain corporate security policies and procedures.
- Pattern Development: Using security best practices, develop security and data protection patterns, working with Enterprise Architecture on pattern standardization.
- Collaboration: Work closely with IT, data engineering, and analytics teams to ensure security requirements are integrated into data pipelines and data storage solutions.
- Compliance & Audits: Ensure compliance with relevant data protection regulations (e.g., GDPR, CCPA) and participate in internal and external audits.
- Continuous Improvement: Stay current with emerging security trends, threats, and technologies. Continuously refine security architectures to address evolving risks.

Qualifications:
- Education: Bachelor's degree in Computer Science, Information Security, or a related field. A Master's degree or professional certifications such as CISSP, CISM, or SANS GIAC are preferred.
- Experience: Minimum of 5 years of experience in application security and DevSecOps roles or a related field, with a proven track record of developing and managing security architectures for complex applications.
- Technical Skills:
  - Deep understanding of data security principles, encryption techniques, and secure data storage practices.
  - Hands-on experience with Snowflake security features, including access controls, encryption, and data masking.
  - Familiarity with other cloud-based data platforms (e.g., AWS, Azure, Google Cloud).
  - Proficient in security frameworks such as NIST, ISO 27001, and CIS Controls.
- Communication Skills: Excellent verbal and written communication skills, with the ability to communicate complex security concepts to a variety of audiences, including technical and non-technical stakeholders.
- Problem-Solving Skills: Strong analytical and problem-solving skills, with the ability to handle complex security issues and quickly adapt to changing environments.
- Leadership: Demonstrated leadership abilities, with the capacity to lead cross-functional teams and drive projects to completion.
- Self-Management: Ability to effectively prioritize and execute tasks in a high-pressure environment.
- Agile Methodologies: Familiarity with SAFe Agile methodologies.

Job Types: Full-time, Contract
Pay: $60.00 - $70.00 per hour
Benefits: 401(k), dental insurance, flexible schedule, health insurance, paid time off, vision insurance
Schedule: Monday to Friday; weekends as needed
Application Question(s): Can you work on a W2 status for the client? No C2C candidates will be considered at this time.
Education: Bachelor's (Required)
Experience: Data Lakehouse environment: 5 years (Required); Snowflake security: 5 years (Required); data governance: 5 years (Required); threat modeling & risk assessment: 5 years (Required); cloud-based data platform: 5 years (Required)
License/Certification: Certifications such as CISSP, CISM, or SANS GIAC (Preferred)
Work Location: Remote
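The RBAC and least-privilege responsibilities above amount to a deny-by-default grant lookup: a role holds only the privileges explicitly granted to it. In Snowflake this is expressed with GRANT statements to roles, but the idea can be sketched in a few lines of Python; the role names, objects, and privileges below are hypothetical, not the client's actual model:

```python
# Each role maps to the exact set of (object, privilege) pairs it has been granted.
ROLE_GRANTS: dict[str, set[tuple[str, str]]] = {
    "analyst":        {("sales_db", "SELECT")},
    "data_engineer":  {("sales_db", "SELECT"), ("sales_db", "INSERT")},
    "security_admin": {("sales_db", "GRANT")},
}

def is_allowed(role: str, obj: str, privilege: str) -> bool:
    """Deny by default: allow only if the privilege was explicitly granted."""
    return (obj, privilege) in ROLE_GRANTS.get(role, set())
```

Least privilege falls out of the structure: an unknown role (or an ungranted privilege) is denied without any special-case code.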
|
2025-06-13 01:39
|
Sr AI Data Engineer (Remote)
Cognisol |
|
Title: Sr AI Data Engineer
Location: Pittsburgh, PA (Remote)
Duration: 6+ months
1099 Rate: $57/hr; W2 Rate: $47/hr

Job Description: We are seeking a highly skilled AI Data Engineer to join our team for Project Acuity. The ideal candidate will have a strong background in AI and data engineering, with the ability to provide solution architecture for AI use cases. This is a contract position with strict timelines, and we are looking for someone who can join the project immediately.

Key Responsibilities:
- Design and implement solution architecture for AI use cases in Project Acuity.
- Develop algorithms to calculate scores based on previous clinical trial performance history and engagement.
- Categorize medical diagnoses into body systems, medications, treatments, diagnoses, and high-level therapeutic areas.
- Collaborate with cross-functional teams to ensure seamless integration of AI solutions.
- Ensure data integrity, security, and compliance with relevant regulations.
- Optimize data processing workflows for efficiency and scalability.
- Provide technical leadership and mentorship to junior team members.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, AI, or a related field.
- Minimum of 5 years of experience as an AI Data Engineer or in a similar role.
- Strong knowledge of AI, machine learning, and data engineering principles.
- Proficiency in programming languages such as Python, R, or Java.
- Experience with cloud platforms (AWS, Azure, GCP) and big data technologies (Hadoop, Spark).
- Familiarity with healthcare data standards and regulations (e.g., HIPAA).
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

Preferred Qualifications:
- Experience in the healthcare or clinical trials industry.
- Knowledge of natural language processing (NLP) techniques.
- Certification in AI or data engineering.

Skills: R, Spark, Java, AI data engineering, Azure, Hadoop, NLP, solution architecture, Python, data engineering, GCP, AI, machine learning, AWS
|
2025-06-13 01:28
|
Senior Data Scientist I - Remote India
outseer |
Remote
|
The data science team at Outseer works to create predictive models that protect people from identity and payment fraud. As a Senior Data Engineer within Outseer, you will work on our core risk engine development team, collaborating closely with our data science team to support and enhance our risk models. Utilizing Java, Python, and Unix technologies, our platform is robust and highly effective.

Essential Duties:
- Agile Development: Engage in the full product lifecycle, including planning, design, development, testing, and project delivery, adhering to our standards.
- Cloud and On-Premises Work: Work with both the Azure cloud platform and on-premises environments.
- Documentation: Create and present Software Requirement Specifications (SRS) and high-level design documents.
- ETL and Data Pipelines: Design and implement ETL processes and data pipelines for analytics and data warehousing.
- Application Development: Develop applications and tools to support the development, testing, and monitoring of analytics pipelines.
- Proof of Concept (POC): Conduct POC experiments on new ideas and techniques.
- Data Quality: Perform data quality analysis, monitoring, and cleaning.
- Pipeline Support: Support the testing, deployment, and production activities of data pipelines across multiple platforms.
- Team Collaboration: Review and provide feedback on the work of other team members.

Desired Requirements:
- Bachelor's degree or equivalent in Computer Science.
- Designing, implementing, running, and monitoring data pipelines, plus cloud engineering via Terraform, Bicep, etc.
- Designing ETL flows for machine learning and analytical workflows.
- Data stack of at least one cloud provider, preferably Azure.
- SQL proficiency.
- Pipeline tools such as Airflow, Azure Data Factory, Synapse, or similar.
- Scientific Python stack, including pandas, numpy, and pyspark.
- Ability to take ownership of large development tasks and lead them.
- Desire and ability to learn new techniques and languages to solve new problems.
- Excellent verbal and written English.
Desired Behaviors:
- Adaptability: Demonstrates flexibility and openness to change. Actively seeks and adopts improved approaches and processes.
- Proactive Action: Takes initiative and is driven by results. Takes ownership of actions and outcomes, meeting commitments and striving for high performance.
- Effective Workload Management: Makes timely decisions, prioritizes tasks effectively, solves problems, monitors results, and takes corrective action when necessary.
- Technical Proficiency: Possesses a solid understanding of their role and responsibilities, demonstrating competence in performing tasks and utilizing relevant technical skills.
- Continuous Learning: Takes personal responsibility for learning and development. Recognizes personal strengths and areas for improvement, actively seeks feedback, and embraces opportunities to learn.
- Effective Communication: Demonstrates strong facilitation and written communication skills. Clearly articulates ideas and proposals, actively listens to colleagues' perspectives, and values diverse viewpoints.
- Collaboration: Shares information, fosters teamwork, and contributes to a positive work environment. Actively collaborates with others and encourages a sense of unity and cooperation among team members.
- Ethical Conduct and Competence: Acts with integrity and intent, displaying ethical character in all actions. Takes accountability for one's own behavior and aligns actions with the company's values and principles.
- Good Citizenship: Represents the values and interests of Outseer. Acts as a positive ambassador for the company and contributes to the overall well-being and success of the organization.

Outseer is committed to the principle of equal employment opportunity for all employees and to providing employees with a work environment free of discrimination and harassment.
All employment decisions at Outseer are based on business needs, job requirements, and individual qualifications, without regard to race, color, religion or belief, national, social, or ethnic origin, sex (including pregnancy), age, physical, mental, or sensory disability, HIV status, sexual orientation, gender identity and/or expression, marital, civil union, or domestic partnership status, past or present military service, family medical history or genetic information, family or parental status, or any other status protected by the laws or regulations in the locations where we operate. Outseer will not tolerate discrimination or harassment based on any of these characteristics. Outseer encourages applicants of all ages.
|
2025-06-12 23:23
|
Data Scientist (Remote)
voleon |
United States or Remote
|
Voleon is a technology company that applies state-of-the-art AI and machine learning techniques to real-world problems in finance. For more than a decade we have led our industry and worked at the frontier of applying AI/ML to investment management. We have become a multibillion-dollar asset manager, and we have ambitious goals for the future. Your colleagues will include internationally recognized experts in artificial intelligence and machine learning research, as well as highly experienced finance and technology professionals. In addition to our enriching and collegial working environment, we offer highly competitive compensation and benefits packages, technology talks by our experts, a beautiful modern office, daily catered lunches, and more.

The Voleon Group is forming a new team to help advance our data-driven investment initiatives. As a Data Scientist, you will be responsible for harvesting insights from a complex array of data. Your role will involve data curation, analysis, interpretation, visualization, and communication of your findings to members of the research staff and executive leadership. This role is a means to make a difference: as a machine learning company, data insights are essential to our business.
➡ Responsibilities ➡
- Design and implement systems to ensure data correctness and monitor data health in data stores and live feeds
- Proactively identify abnormal production behavior and communicate it clearly to relevant stakeholders
- Perform extemporaneous analyses on research and production trading systems with leadership
- Harness financial expertise and statistical analysis to gain actionable insights into our production trading and research systems
- Design and implement analysis pipelines that automate those analyses found to be valuable for ongoing monitoring

Requirements ➡
- 1+ years of applied end-to-end industry experience (including internships) working with complex datasets, including curation, querying, aggregation, exploratory data analysis, and visualization
- Experience using statistical methods to analyze data, identify patterns, conduct root cause analysis, discover insights, and recommend solutions
- Ability to frame and answer questions mathematically
- Ability to infer useful forward-looking directions from results of retrospective analysis
- Fluency in managing, processing, and visualizing tabular data using a combination of SQL, Pandas, and R
- Basic software development skills and experience with bash, Linux/Unix, and git
- Ability to refine requirements from ambiguous requests to produce reports, demonstrating excellence in communication
- Bachelor's degree in a quantitative discipline (statistics, biostatistics, data science, computer science, or a related field)

Preferred ➡
- Master's degree in a quantitative discipline
- Prior industry experience or displayed interest in finance, such as related academic projects, coursework in financial engineering, or industry internships
- Experience developing in a production-facing environment and familiarity with standard concepts and tooling, e.g., CI/CD, git, Airflow

➡ Compensation
The base salary range for this position is $150,000 to $190,000 in the location(s) of this posting.
Individual salaries are determined through a variety of factors, including but not limited to education, experience, knowledge, skills, and geography. Base salary does not include other forms of total compensation, such as bonus compensation and other benefits. Our benefits package includes medical, dental, and vision coverage, life and AD&D insurance, 20 days of paid time off, 9 sick days, and a 401(k) plan with a company match.

"Friends of Voleon" Candidate Referral Program: If you have a great candidate in mind for this role and would like the potential to earn $7,500 if your referred candidate is successfully hired and employed by The Voleon Group, please use this form to submit your referral. For more details regarding eligibility, terms, and conditions, please make sure to review the Voleon Referral Bonus Program.

Equal Opportunity Employer: The Voleon Group is an Equal Opportunity employer. Applicants are considered without regard to race, color, religion, creed, national origin, age, sex, gender, marital status, sexual orientation and identity, genetic information, veteran status, citizenship, or any other factors prohibited by local, state, or federal law.

Vaccination Requirement: The Voleon Group has implemented a policy requiring all employees who will be entering our worksite, including new hires, to be fully vaccinated with the COVID-19 vaccine. This policy also applies to remote employees, as such employees will be asked to visit our offices from time to time. To the extent permitted by applicable law, proof of vaccination will be required as a condition of employment. This policy is part of Voleon's ongoing efforts to ensure the safety and well-being of our employees and community and to support public health efforts.
|
2025-06-12 20:41
|
Data Engineer
yuno |
Bogota
|
Remote, LATAM, Full Time, Individual Contributor, 3+ years of experience

Who We Are: At Yuno we are building the payment infrastructure that enables all companies to participate in the global market. Founded by a team of seasoned experts in the payments and IT industries, Yuno provides a high-performance payment orchestrator. Our technology offers companies access to leading payment capabilities, allowing them to engage customers confidently and maintain global business operations with seamless payment integrations worldwide. Shape your future with Yuno!

We are orchestrating the best high-performing team! If you're a Data Engineer specialized in ETLs, you enjoy solving complex problems with code, and you are not afraid of learning new things, we are looking for you. As a Sr Data Engineer you will be part of the team delivering the different parts of our production-ready product, while co-designing and implementing an architecture that can scale up with the product and the company.

🟣 Your challenge at Yuno
• Implement any type of extraction (manual files, emails, SFTP, and APIs).
• Build solutions for application integrations, task automation, and any relevant data automation using proven design patterns.
• Design and build data processing pipelines for large volumes of data that are performant and scalable.
• Build and maintain the infrastructure required for extraction, loading, transformation, and storage of data from multiple data sources using custom scripts.
• Collaborate with the team to develop and maintain a robust and scalable data infrastructure to support our data needs.
• Implement and enforce data governance policies and best practices to ensure data quality, security, and compliance.
• Manage and optimize data warehousing solutions for efficient storage and retrieval of data.
• Develop and maintain data lake solutions for storing and managing diverse data types.
• Use big data technologies and frameworks to process and analyze large datasets efficiently.
• Work with distributed data systems and technologies to handle high volumes of data.
• Integrate data from multiple sources.

🟣 Skills you need
Minimum Qualifications:
• Proven experience as a Data Engineer or similar role in a data-intensive environment.
• Strong proficiency in Python and SQL.
• Knowledge of data infrastructure design and management.
• Familiarity with data governance principles and practices.
• Experience with ETL processes and tools.
• Proficiency in working with Data Warehouses or Data Lakes.
• Familiarity with big data technologies and distributed data systems.
• Ability to integrate data from multiple sources.
• Knowledge of Spark is a plus.
• Verbal and written English fluency.

🟣 What we offer at Yuno
• Competitive compensation
• Remote work - you can work from everywhere!
• Home office bonus - we offer a one-time allowance to help you create your ideal home office.
• Work equipment
• Stock options
• Health plan, wherever you are
• Flexible days off
• Language, professional, and personal growth courses
|
2025-06-12 18:38
|
Data & AI Cloud Engineer—GCP (remote UK)
revolgy |
London, n/a (Remote)
|
As a Data & AI Cloud Engineer at Revolgy you will be a key contributor to the design, development and deployment of data and AI solutions on Google Cloud Platform (GCP). You will leverage your expertise in data engineering, AI and cloud technologies to help our customers transform their data into actionable insights. This is an excellent opportunity to deepen your technical skills, work on diverse projects and grow your career within a dynamic and supportive team.

Location 🌍 Remote position open to candidates in the UK.

Contract Type 📃 This position is open to candidates seeking employment under a standard employment agreement or engagement as an independent contractor under a business-to-business (B2B) contractual arrangement.

About Revolgy ☁️ Revolgy is a leading multinational company providing digital transformation services through online collaboration tools and cloud infrastructure, operating on Google Cloud Platform and Amazon Web Services. We are a premier partner of Google and Amazon and serve over 2,000 customers around the globe.

What Will You Do? 🔧
Contribute to the design of data solutions and assist with their implementation on GCP.
Develop and deploy AI/ML models using Vertex AI, applying your knowledge of machine learning techniques.
Build and optimise efficient data pipelines to ensure smooth data flow and processing.
Implement data quality monitoring processes to maintain data accuracy and reliability.
Help automate infrastructure and deployment tasks to improve efficiency and scalability.
Troubleshoot technical issues related to data pipelines, AI models and cloud infrastructure.
Create clear and concise technical documentation for solutions and processes.

What You Bring 🎯
Demonstrated experience with Google Cloud Platform (GCP) services, including BigQuery, BigQuery ML, Dataflow and Vertex AI.
Working knowledge of data engineering, specifically building and maintaining data pipelines.
Proficiency in programming languages such as Python.
Familiarity with the fundamentals of AI/ML concepts and tools such as Vertex AI.
Understanding of data governance principles and best practices.
Strong problem-solving skills and the ability to analyze and resolve technical challenges.
Excellent communication skills and the ability to collaborate effectively within a team.
Knowledge of DevOps concepts, including CI/CD and infrastructure as code.

Why Join Revolgy?
🌎 Fully remote setup & flexible remote-first culture
🤲 Contribution to pension plan
🏖️ 5 weeks of paid vacation & sick days
🏠 Budget for work equipment and energy costs
📱 Mobile phone & mobile tariff contribution
💻 Company notebook
👥 Growth & learning: access to continuous training, certifications and career development opportunities
🔥 Challenging environment: work on complex, high-impact cloud projects that push the boundaries of innovation
✈️ Travel to partner events, company gatherings and more

Equal Opportunity Employer 🌈 Diversity and equal opportunity are important to us. We welcome the interest of all candidates and strive to provide feedback as quickly as possible.
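The "data quality monitoring" duty mentioned above might, in its simplest form, look something like the check below. This is an illustrative sketch only: the column names, batch shape and null-rate threshold are invented assumptions, not Revolgy's actual process.

```python
# Data-quality check sketch (hypothetical rules): validate a batch of rows
# against simple expectations before letting it enter a pipeline.

def check_batch(rows, required=("user_id", "event_ts"), max_null_rate=0.1):
    # Flag any required column whose share of missing values exceeds the
    # allowed threshold; return a human-readable list of issues.
    issues = []
    for col in required:
        nulls = sum(1 for r in rows if r.get(col) is None)
        rate = nulls / len(rows) if rows else 1.0
        if rate > max_null_rate:
            issues.append(f"{col}: null rate {rate:.0%} exceeds {max_null_rate:.0%}")
    return issues

batch = [
    {"user_id": 1, "event_ts": "2025-06-12T18:00:00Z"},
    {"user_id": None, "event_ts": "2025-06-12T18:01:00Z"},
    {"user_id": 3, "event_ts": None},
]
problems = check_batch(batch)
```

In production such rules would typically run inside the pipeline itself (e.g. as a Dataflow step or a scheduled query), with failures routed to alerting rather than returned as a list.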
|
2025-06-12 18:02
|
Data & AI Cloud Engineer—GCP (remote Sweden)
revolgy |
Stockholm, n/a (Remote)
|
As a Data & AI Cloud Engineer at Revolgy you will be a key contributor to the design, development and deployment of data and AI solutions on Google Cloud Platform (GCP). You will leverage your expertise in data engineering, AI and cloud technologies to help our customers transform their data into actionable insights. This is an excellent opportunity to deepen your technical skills, work on diverse projects and grow your career within a dynamic and supportive team.

Location 🌍 Remote position open to candidates in Sweden.

Contract Type 📃 Engagement as an independent contractor under a business-to-business (B2B) contractual arrangement.

About Revolgy ☁️ Revolgy is a leading multinational company providing digital transformation services through online collaboration tools and cloud infrastructure, operating on Google Cloud Platform and Amazon Web Services. We are a premier partner of Google and Amazon and serve over 2,000 customers around the globe.

What Will You Do? 🔧
Contribute to the design of data solutions and assist with their implementation on GCP.
Develop and deploy AI/ML models using Vertex AI, applying your knowledge of machine learning techniques.
Build and optimise efficient data pipelines to ensure smooth data flow and processing.
Implement data quality monitoring processes to maintain data accuracy and reliability.
Help automate infrastructure and deployment tasks to improve efficiency and scalability.
Troubleshoot technical issues related to data pipelines, AI models and cloud infrastructure.
Create clear and concise technical documentation for solutions and processes.

What You Bring 🎯
Demonstrated experience with Google Cloud Platform (GCP) services, including BigQuery, BigQuery ML, Dataflow and Vertex AI.
Working knowledge of data engineering, specifically building and maintaining data pipelines.
Proficiency in programming languages such as Python.
Familiarity with the fundamentals of AI/ML concepts and tools such as Vertex AI.
Understanding of data governance principles and best practices.
Strong problem-solving skills and the ability to analyze and resolve technical challenges.
Excellent communication skills and the ability to collaborate effectively within a team.
Knowledge of DevOps concepts, including CI/CD and infrastructure as code.

Why Join Revolgy?
🌎 Fully remote setup & flexible remote-first culture
🚀 Cutting-edge technology stack: work with AI augmentation, automation and advanced cloud solutions
👥 Growth & learning: access to continuous training, certifications and career development opportunities
🔥 Challenging environment: work on complex, high-impact cloud projects that push the boundaries of innovation
✈️ Travel to partner events, company gatherings and more

Equal Opportunity Employer 🌈 Diversity and equal opportunity are important to us. We welcome the interest of all candidates and strive to provide feedback as quickly as possible.
|
2025-06-12 18:02
|