Job Title | Company | Location | Description | Posted
---|---|---|---|---
Software Engineer - Ruby/React
New Relic |
Portland, OR
|
We are a global team of innovators and pioneers dedicated to shaping the future of observability. At New Relic, we build an intelligent platform that empowers companies to thrive in an AI-first world by giving them unparalleled insight into their complex systems. As we continue to expand our global footprint, we're looking for passionate people to join our mission. If you're ready to help the world's best companies optimize their digital applications, we invite you to explore a career with us!

Your opportunity: New Relic's Product & Subscription Services (PASS) Team is seeking an experienced software engineer who can contribute to the team's mission to accelerate time to market for new products. You'll aid that mission by:
- Providing self-serve components and services that feature teams use to build and sell their products.
- Improving the product ecosystem to reduce the cost and manual toil needed to bring new products to market.

We are a remote-first, US Pacific-based team that focuses on delivering incremental value, releasing multiple times per day, and using Agile techniques, tools, and methodologies. Our company provides businesses with a state-of-the-art observability platform that leverages advanced technologies to deliver real-time insights into the performance of software applications and infrastructure. Our products lead the industry by enabling organizations to monitor, analyze, and optimize their systems to achieve improvements in reliability, performance, and user experience. If all of that excites you as much as it excites us, we would love to hear from you!

What you'll do:
- Develop reliable and stable services using Ruby on Rails and Java.
- Create appealing user interfaces for account configuration and product provisioning using React.
- Work with Kafka to provide streams of data for other teams to ingest.
- Use New Relic's product features to monitor and troubleshoot your own services.
- Participate in an on-call rotation to support our services; we bake stability into everything we do to ensure a healthy work-life balance.
- Work with some of the smartest, nicest people you'll ever meet.

This role requires:
- 2+ years of experience developing customer-facing Ruby on Rails applications in a team environment.
- Experience with front-end frameworks (we use React).
- Experience with REST APIs.
- A collaborative work style that includes colleagues in important decisions and leads to shared code ownership.
- Strong problem-solving skills and an ability to advocate your point of view without ego.
- Capable written and verbal communication skills within both our remote-first team and our international company.
- Comfort and willingness to give and receive feedback, accountability, and candor in a blameless work culture.

Bonus points if you have:
- Familiarity with Java, JavaScript, React, and/or GraphQL.
- Experience deploying and working with distributed-systems technologies.
- Experience using Apache Kafka or an alternative messaging system.
- Experience with unit and integration test automation that adds to a development team's efficiency and reliability, as well as a passion for applying these techniques to your projects.

Please note that visa sponsorship is not available for this position. #LI-GK1 #LI-Remote

The pay range below represents a reasonable estimate of the salary for the listed position. This role is eligible for a corporate bonus plan. Pay within this range varies by work location and may also depend on job-related factors such as an applicant's skills, qualifications, and experience. New Relic provides a variety of benefits for this role, including healthcare, dental, vision, parental leave and planning, mental health benefits, a 401(k) plan and match, flex time-off, 11 paid holidays, volunteer time-off, and other competitive benefits designed to improve the lives of our employees.
Estimated Base Pay Range: $106,000 - $133,000 USD

Fostering a diverse, welcoming, and inclusive environment is important to us. We work hard to make everyone feel comfortable bringing their best, most authentic selves to work every day. We celebrate our talented Relics' different backgrounds and abilities, and recognize the different paths they took to reach us, including nontraditional ones. Their experiences and perspectives inspire us to make our products and company the best they can be. We're looking for people who feel connected to our mission and values, not just candidates who check off all the boxes. If you require a reasonable accommodation to complete any part of the application or recruiting process, please reach out to resume@newrelic.com.

We believe in empowering all Relics to achieve professional and business success through a flexible workforce model. This model allows us to work in a variety of workplaces that best support our success, including fully office-based, fully remote, or hybrid.

Our hiring process: In compliance with applicable law, all persons hired will be required to verify identity and eligibility to work and to complete employment eligibility verification. Note: our stewardship of the data of thousands of customers means that a criminal background check is required to join New Relic. We will consider qualified applicants with arrest and conviction records based on individual circumstances and in accordance with applicable law, including but not limited to the San Francisco Fair Chance Ordinance. Headhunters and recruitment agencies may not submit resumes/CVs through this website or directly to managers. New Relic does not accept unsolicited headhunter and agency resumes and will not pay fees to any third-party agency or company that does not have a signed agreement with New Relic. New Relic develops and distributes encryption software and technology that complies with U.S. export controls and licensing requirements. Certain New Relic roles require candidates to pass an export compliance assessment as a condition of employment in any global location. If relevant, we will provide more information later in the application process. Candidates are evaluated based on qualifications, regardless of race, religion, ethnicity, national origin, sex, sexual orientation, gender expression or identity, age, disability, neurodiversity, veteran or marital status, political viewpoint, or other legally protected characteristics. Review our Applicant Privacy Notice at https://newrelic.com/termsandconditions/applicant-privacy-policy
|
|
Resident Solutions Architect (RSA)
TISTA Science and Technology Corporation |
Remote
|
Overview: TISTA is seeking a highly skilled Resident Solutions Architect (RSA) to provide advanced technical expertise and guidance in support of our Data and Artificial Intelligence (AI) initiatives. The RSA will play a critical role in shaping and executing modern cloud-native architectures, ensuring secure, scalable, and innovative solutions that align with mission and enterprise requirements. TISTA associates enjoy above-industry healthcare benefits, remote working options, paid time off, training/certification opportunities, a Healthcare Savings Account & Flexible Savings Account, paid life insurance, short-term & long-term disability, 401K match, tuition reimbursement, an Employee Assistance Program, paid holidays, military leave, and much more!

Responsibilities:
- Design and implement end-to-end architectures on Databricks, Spark, and Delta Lake, supporting large-scale ingestion, ELT, and advanced analytics.
- Architect medallion (Bronze, Silver, Gold) data lakehouse patterns with batch and real-time pipelines.
- Enable ML/AI capabilities using MLflow, TensorFlow, PyTorch, and other libraries within collaborative workspaces.
- Define and enforce security standards, including RBAC, SCIM, SAML SSO, and secrets management.
- Ensure compliance with security requirements while deploying in customer-controlled VPCs.
- Integrate BI tools (Tableau, Power BI, etc.) and ensure interoperability across ecosystems.
- Provide technical leadership, documentation, and mentorship to engineering and data science teams.
- Partner with stakeholders to translate requirements into scalable and secure solutions.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 8+ years of experience in solutions architecture, data engineering, or related roles.
- Strong expertise in Databricks, Apache Spark, and Delta Lake.
- Proven experience designing secure cloud-native architectures (AWS, Azure, or GCP).
- Hands-on experience with SQL, Python, R, or Scala in data/analytics contexts.
- Deep understanding of ACID transactions, data versioning, and point-in-time querying.
- Practical experience with AI/ML model development, deployment, or integration.
- Experience with data governance, security controls, and RBAC using Unity Catalog.
- Strong communication skills, with the ability to engage both technical and business stakeholders.

Preferred Qualifications:
- Master's degree in a technical discipline, or a Bachelor's with 10+ years of experience.
- Experience supporting large-scale federal or healthcare data platforms.
- Familiarity with ML/AI tools (MLflow, TensorFlow, PyTorch, XGBoost).
- Knowledge of CI/CD pipelines, GitHub integration, and DevOps best practices.
- Experience with BI and visualization tools (Power BI, Tableau, MicroStrategy, Qlik).
- Databricks certification (e.g., Databricks Certified Solutions Architect, Databricks Certified Data Engineer).
- Relevant cloud certifications (AWS Solutions Architect, Azure Solutions Architect, Databricks Certified Architect).

Location: Remote (U.S. based). Clearance Requirement: Must be a U.S. Citizen or Green Card holder and able to obtain a Public Trust clearance. Employment Type: Full-time/Contract. Pay Range: The pay for this position ranges from $140,730 to $180,500. The actual salary offer will carefully consider a wide range of factors, including your skills, qualifications, experience, and location. Also, certain positions are eligible for additional forms of compensation, such as bonuses. TISTA associates are eligible to participate in our comprehensive benefits plan! More information can be found here: https://tistatech.com/working-at-tista/
|
|
Senior Applications Support Engineer (Mexico-Remote)
MATRIXX Software, Inc. |
Remote
|
Applicants must be based in Mexico; Central and East Coast time zones preferred. The purpose of the role is to support all aspects of MATRIXX Software applications with Tier 1 Telco customers globally and to work collaboratively with MATRIXX's engineering and professional services teams, with the goal of becoming a MATRIXX software certified product expert; to take ownership of complex customer- and partner-reported issues; and to assist with project delivery in the deployment and operational handover phases.

Key responsibilities:
- Collaborate with internal and external stakeholders throughout the customer and partner lifecycle.
- Provide remote support under stringent SLAs to MATRIXX customers and partners in all aspects of the MATRIXX platform.
- Assist in deployment and operational support activities.
- Participate in an out-of-hours on-call rotation.
- Support the customer relationship with the goal of establishing and maintaining a professional "trusted advisor" position.
- Identify, communicate, and drive improvement initiatives for processes and tools.
- Monitor mobile technology and industry developments to ensure your knowledge remains current.
- Provide coaching and mentoring to more junior colleagues.

Required skills and experience:
- Proven experience working in a Support or Services role, preferably in the Telco environment.
- Proven experience working with real-time network technology (Intelligent Networks, Online/Converged Charging Systems, Mediation, Policy Control).
- Evidence of technical expertise and significant experience in the following Telco skills: Unix and Linux; applications and systems (OCS/CCS, IN, Billing, PCRF); networking protocols (DIAMETER, 5G, SS7 MAP, INAP, CAP).
- A bachelor's degree or equivalent technical/science-based qualification.
- Good communication skills covering both technical and non-technical aspects.
- International working experience.

Preferred Qualifications: Additional skills in one or more of the following: IT and web service protocols (HTTP/REST); scripting languages (Python, Perl, Bash); OS and programming (Red Hat Enterprise Linux, Java, C++); monitoring applications and visualization (Prometheus/Grafana/PRTG); monitoring protocols (SNMP, JMX); IP networking (routing/switching, load balancing, resilience concepts); cloud/virtualisation (VMware, Azure, AWS, Google Cloud Platform, Kubernetes/Docker, OpenShift, KVM); Apache HBase, MongoDB, Redpanda, Apache Kafka, Apache ActiveMQ.

Competencies:
- Resolver - Identifies the problem and provides a resolution.
- Focused - Takes direction, follows through, and makes the corrections necessary to stay on track, prioritizing before reacting.
- Collaborator - Works hard with others to achieve the team's goals and objectives.
- Mentor - Recognizes and cultivates the potential in others, identifies incremental improvements, and derives satisfaction from evidence of progress.
- Influencer - Connects with someone new and wins them over.
- Achiever - Takes satisfaction from being busy and productive, possessing a great deal of stamina and working hard.
- Adaptability - Embraces the pivot; a problem-solver doing what needs to be done.
- Boldness - Creates alternative ways to proceed quickly; able to spot the relevant patterns and issues of any given scenario.
- Forward Thinker - Willing to question the status quo; unafraid to challenge convention, looking for innovative solutions and outcomes that move the needle.
- Integrity - Committed to stable values such as honesty and loyalty, taking psychological ownership of what they say they will do.
- Process-oriented - Treats all people equally, with stable routines and clear rules and procedures that everyone can follow.
- Results driven - An independent desire to make a big impact, prioritizing projects based on how much influence they will have on their organization or the people around them.
- Values Driven - Having unchanging core values that define one's purpose.

Compensation: The US pay range for this position is $125,000 - $149,000 per year; however, base pay offered may vary depending on job-related knowledge, skills, and experience. Base pay information is based on the US market only and will vary by global location. Variable pay in the form of bonus or commission may be provided as part of the compensation package, in addition to a full range of medical and/or other benefits, dependent on the location of the position offered.
|
|
AI/NLP Engineer
LeoTech |
Irvine, CA
|
At LeoTech, we are passionate about building software that solves real-world problems in the public safety sector. Our software has been used to help fight continuing criminal enterprises and drug trafficking organizations, identify financial fraud, disrupt sex and human trafficking rings, and address mental health matters, to name a few. As an AI/NLP Engineer on our Data Science team, you will be at the forefront of leveraging Large Language Models (LLMs) and cutting-edge AI techniques to create transformative solutions for public safety and intelligence workflows. You will apply your expertise in LLMs, Retrieval-Augmented Generation (RAG), semantic search, Agentic AI, GraphRAG, and other advanced AI solutions to develop, enhance, and deploy robust features that enable real-time decision-making for our end users. You will work closely with product, engineering, and data science teams to translate real-world problems into scalable, production-grade solutions. This is an individual contributor (IC) role that emphasizes technical depth, experimentation, and hands-on engineering. You will participate in all phases of the AI solution lifecycle, from architecture and design through prototyping, implementation, evaluation, productionization, and continuous improvement.

### Core Responsibilities
- Design, build, and optimize AI-powered solutions using LLMs, RAG pipelines, semantic search, GraphRAG, and Agentic AI architectures.
- Implement and experiment with the latest advancements in large-scale language modeling, including prompt engineering, model fine-tuning, evaluation, and monitoring.
- Collaborate with product, backend, and data engineering teams to define requirements, break down complex problems, and deliver high-impact features aligned with business objectives.
- Build robust data ingestion and retrieval pipelines that power real-time and batch AI applications using open-source and proprietary tools.
- Integrate external data sources (e.g., knowledge graphs, internal databases, third-party APIs) to enhance the context-awareness and capabilities of LLM-based workflows.
- Evaluate and implement best practices for prompt design, model alignment, safety, and guardrails for responsible AI deployment.
- Stay on top of emerging AI research and contribute to internal knowledge-sharing, tech talks, and proof-of-concept projects.
- Author clean, well-documented, and testable code; participate in peer code reviews and engineering design discussions.
- Proactively identify bottlenecks and propose solutions to improve system scalability, efficiency, and reliability.

### What We Value
- Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Data Science, or a related field.
- 5+ years of hands-on experience in applied AI, NLP, or ML engineering (with at least 2 years working directly with LLMs, RAG, semantic search, and Agentic AI).
- Deep familiarity with LLMs (e.g., OpenAI, Claude, Gemini), prompt engineering, and responsible deployment in production settings.
- Experience designing, building, and optimizing RAG pipelines, semantic search, vector databases (e.g., ElasticSearch, Pinecone), and agentic or multi-agent AI workflows in large-scale production setups.
- Exposure to MCP and the A2A protocol is a plus.
- Exposure to GraphRAG or graph-based knowledge retrieval techniques is a strong plus.
- Strong proficiency with modern ML frameworks and libraries (e.g., LangChain, LlamaIndex, PyTorch, HuggingFace Transformers).
- Ability to design APIs and scalable backend services, with hands-on experience in Python.
- Experience building, deploying, and monitoring AI/ML workloads in cloud environments (AWS, Azure) using services like AWS SageMaker, AWS Bedrock, Azure AI, etc.
- Experience with tools to load-balance different LLM providers is a plus.
- Familiarity with MLOps practices: CI/CD for AI, model monitoring, data versioning, and continuous integration.
- Demonstrated ability to work with large, complex datasets; perform data cleaning and feature engineering; and develop scalable data pipelines.
- Excellent problem-solving, collaboration, and communication skills; able to work effectively across remote and distributed teams.
- Proven record of shipping robust, high-impact AI solutions, ideally in fast-paced or regulated environments.

### Technologies We Use
- Cloud & AI Platforms: AWS (Bedrock, SageMaker, Lambda), Azure AI, Pinecone, Elastic Cloud, Imply Polaris.
- LLMs & NLP: HuggingFace, OpenAI API, LangChain, LlamaIndex, Cohere, Anthropic.
- Backend: Python (primary), Elixir (other teams).
- Data Infrastructure: ElasticSearch, Pinecone, Weaviate, Apache Kafka, Airflow.
- Frontend: TypeScript, React.
- DevOps & Automation: Terraform, EKS, GitHub Actions, CodePipeline, ArgoCD.
- Monitoring & Metrics: Grafana (metrics, dashboards, alerting), Langfuse (Agentic AI observability, prompt management).
- Testing: Playwright for end-to-end test automation.
- Other Tools: A mix of open-source and proprietary frameworks tailored to complex real-world problems.

### What You Can Expect
- Great team camaraderie, whether at our Irvine office or working remotely.
- A fast pace and challenging problems to solve.
- Modern technologies and tools.
- A continuous learning environment.
- The opportunity to communicate and work with people of all technical levels in a team environment.
- Growth as you are given feedback and incorporate it into your work.
- A self-managing team that enjoys support and direction when required.
- 3 weeks of paid vacation, out of the gate!
- Competitive salary.
- Generous medical, dental, and vision plans.
- Sick days and paid holidays.
- Talented and collaborative co-workers.

LeoTech is an equal opportunity employer and does not discriminate on the basis of any legally protected status.
|
|
Senior Database Engineer
goodsservices |
United States-Remote
|
Goods & Services is a product design and engineering company. We solve mission-critical challenges for some of the world's largest enterprises, with deep expertise in highly regulated industries, including life sciences and financial services. Our design-led approach allows us to apply cutting-edge capabilities in AI, Data, and Hardware Engineering to companies of any size. Headquartered in the United States, we operate regional development centers in Mexico and the United Kingdom. This global footprint, anchored by our nearshore model, enables us to deliver at scale with the speed, efficiency, and cultural alignment our clients expect.

About the job: Goods & Services is looking for a Senior Database Engineer with a strong background in the financial industry and hands-on experience leveraging AI-powered tools, including dbt Copilot, Cortex AI, GPT, and other Generative AI technologies, to join our dynamic and growing data engineering team.

Core Function: Design, build, and maintain data infrastructure and pipelines to support large-scale data processing and analysis. Enable efficient data collection, storage, and transformation for business applications and end users. Collaborate with cross-functional teams to deliver reliable and strategic data services. Ensure data accuracy, scalability, and performance across platforms and solutions. Leverage AWS and/or Snowflake environments to manage cloud-based data systems.

What you'll do:
- Design, develop, and maintain scalable and reliable data pipelines to ingest, transform, and load structured, semi-structured, and unstructured data from various sources into our data lake and warehouse environments.
- Implement data integration solutions to consolidate data from disparate sources, including databases, APIs, streaming platforms, and third-party services.
- Optimize data processing workflows for performance, efficiency, and scalability using distributed computing or parallel processing frameworks such as Fivetran, dbt, Snowpark, Snowpipe, etc.
- Collaborate cross-functionally with IT and business stakeholders to understand data requirements, define data models, and develop solutions to support data services, reporting, and software development.
- Partner with data, IT, and business teams to improve the design and building of metrics to enhance our analytic capabilities.
- Implement data quality checks, data validation processes, and error-handling mechanisms to ensure the accuracy, completeness, and reliability of data across all stages of the data lifecycle.
- Design and maintain data schemas, data dictionaries, and metadata repositories to support the documentation of data lineage, definitions, and dependencies.
- Apply best practices for AWS and Snowflake architectures, data pipelines, and data models.
- Support the development and maintenance of data governance policies, standards, and best practices to ensure compliance with data privacy regulations and industry standards.
- Monitor, troubleshoot, and optimize the performance and availability of data systems and infrastructure using monitoring and logging tools such as Prometheus.
- Stay current with emerging technologies, tools, and trends in data engineering, cloud architecture, and cloud computing to evaluate their potential impact and relevance to our data platforms.

What you'll need:
- Bachelor's or master's degree in Computer Science, Engineering, or a related field.
- 10+ years of experience in data engineering, data pipelines, and data services.
- Ability to integrate and operate AI-powered tools such as dbt Copilot, Cortex AI, GPT, and similar Gen AI solutions to automate code generation, data quality checks, and documentation within data engineering workflows.
- Strong proficiency in programming languages such as Python, SQL, Spark, or Java, with experience in data manipulation, transformation, and analysis.
- Hands-on experience with cloud-based data platforms and services such as AWS and Snowflake.
- Experience with distributed computing frameworks such as Apache Spark, Kafka, etc.
- Proficiency in database systems, data warehousing, data patterns/architectures, and SQL query optimization.
- Familiarity with containerization and orchestration technologies such as Docker or Kubernetes.
- Excellent problem-solving skills, attention to detail, and an ability to work effectively in a fast-paced and collaborative environment.
- Strong communication, interpersonal, and teamwork skills, with the ability to interact with stakeholders at all levels of the data team.

Why you'll love working here: We believe in making things better by making better things. Remote work: we're a global design and technology company with a presence in Mexico, the US, and London. Tools: we provide you with a laptop, ready to add value with us! Full-time job: work Monday through Friday. Be Rich Program: we live our mission by adding value to our communities. Professional development: we offer regular in-house Centers of Excellence (CoE) and learning sessions for professional growth. Internal referral program: refer a friend and, if hired, participate in a raffle to win amazing prizes! Other benefits: as made available and offered by the employer.
|
|
Senior Data Engineer - [J209]
Skm Group |
Remote United Kingdom
|
We are seeking a Senior Data Engineer to design, build, and optimize scalable data pipelines and platforms. You will play a key role in developing cloud-native data solutions using Databricks, Spark, and Azure services, ensuring high performance, reliability, and security of enterprise data systems.

Qualifications:
- Advanced Python skills with strong software engineering fundamentals.
- Proven expertise in Apache Spark and SQL for large-scale data engineering and analytics.
- Hands-on experience with Databricks, including ingestion, transformation, and its latest features.
- Familiarity with Delta Lake and modern data warehousing concepts.
- Proficiency with Azure services (must-have): Azure Data Factory, Cosmos DB, Functions, Event Store, ADLS, Key Vault.
- Experience with cloud environments (Azure required; AWS/GCP nice-to-have).
- Strong knowledge of RDBMS and NoSQL databases, with the ability to choose the right technology per use case.
- Experience applying Data as Code principles: version control, frequent commits, unit tests, CI/CD, packaging.
- Proficiency with Docker (required); Kubernetes is a plus.
- Familiarity with reporting/visualization tools (Power BI a plus).
- Exposure to ML/AI integration into data platforms is a plus.
- Solid understanding of the software development lifecycle.
- Excellent communication, interpersonal, and presentation skills.

What do we offer you?
- An attractive salary.
- Great freedom and real influence.
- No unhealthy competition; a team approach to meeting challenges.
- A remote-first, flexible working culture.
- Company apartments in cool cities across Europe: work and enjoy a memorable getaway.

About Us: We are a software house with 18 years of experience and a global portfolio of projects. We help businesses modernize, scale, and innovate through custom software solutions, always with a focus on flexibility and quality. Our team embraces unconventional ideas and new technologies, delivering solutions that drive real impact. If you value professionalism, creativity, and a strong engineering culture, you'll feel at home here.

Job Type: Full-time. Pay: £45,000.00-£102,000.00 per year. Experience: Data Engineer: 4 years (required). Work Location: Remote.
|
|
Contract- Senior Support Engineer - Belgium (Remote)
MATRIXX Software |
Remote United States
|
Purpose of the role: To support all aspects of MATRIXX Software applications with Tier 1 Telco customers globally, and to work collaboratively with MATRIXX’s engineering and professional services teams with the goal of becoming a MATRIXX software certified product expert. To take ownership of complex customer- and partner-reported issues, and to assist with project delivery in the deployment and operational handover phases.

Key responsibilities: Collaborate with internal and external stakeholders throughout the customer and partner lifecycle. Provide remote support under stringent SLAs to MATRIXX customers and partners in all aspects of the MATRIXX platform. Assist in deployment and operational support activities. Participate in the out-of-hours on-call rota. Support customer relationships with the goal of establishing and maintaining a highly professional, ‘trusted advisor’ position. Identify, communicate, and drive improvement initiatives for processes and tools. Monitor mobile technology and industry developments to keep your knowledge current. Provide coaching and mentoring to more junior colleagues.

Required skills and experience: Proven experience in a Support or Services role, preferably in a Telco environment. Proven experience with real-time network technology (Intelligent Networks, Online/Converged Charging Systems, Mediation, Policy Control). Evidence of technical expertise and significant experience in the following OS/Telco skills: Unix/Linux; applications and systems (OCS/CCS, IN, Billing, PCRF); networking protocols (DIAMETER, 5G, SS7, MAP, INAP, CAP). A bachelor’s degree or equivalent technical/science-based qualification. Good communication skills covering both technical and non-technical aspects. International working experience.

Desirable skills — additional skills in one or more of the following: IT and web service protocols (HTTP/REST); scripting languages (Python, Perl, Bash); OS and programming (Red Hat Enterprise Linux, Java, C++); monitoring applications and visualization (Prometheus/Grafana/PRTG); monitoring protocols (SNMP, JMX); IP networking (routing/switching, load balancing, resilience concepts); cloud/virtualisation (VMware, Azure, AWS, Google Cloud Platform, Kubernetes/Docker, OpenShift, KVM); Apache HBase, MongoDB, Redpanda, Apache Kafka, Apache ActiveMQ.

Competencies: Resolver – identifies the problem and provides a resolution. Focused – takes direction, follows through, and makes the corrections necessary to stay on track, prioritizing before reacting. Collaborator – works hard with others to achieve the team’s goals and objectives. Mentor – recognizes and cultivates the potential in others, identifies incremental improvements, and derives satisfaction from evidence of progress. Influencer – connects with someone new and wins them over. Achiever – takes satisfaction from being busy and productive, possessing a great deal of stamina and working hard. Adaptability – embraces the pivot; a problem-solver who does what needs to be done. Boldness – creates alternative ways to proceed quickly; able to spot the relevant patterns and issues of any given scenario. Forward Thinker – willing to question the status quo; unafraid to challenge convention, looking for innovative solutions and outcomes that move the needle. Integrity – committed to stable values such as honesty and loyalty, taking psychological ownership of what they say they will do. Process-oriented – treats all people equally, with stable routines and clear rules and procedures that everyone can follow. Results driven – an independent desire to make a big impact, prioritizing projects based on how much influence they will have on the organization or the people around them.
Values Driven – having unchanging core values that define one's purpose.
|
|
Senior Applications Support Engineer (Mexico-Remote)
MATRIXX Software |
Remote, United States
|
Applicants must be based in Mexico; Central and East Coast time zones preferred. Purpose of the role: To support all aspects of MATRIXX Software applications with Tier 1 Telco customers globally, and to work collaboratively with MATRIXX’s engineering and professional services teams with the goal of becoming a MATRIXX software certified product expert. To take ownership of complex customer- and partner-reported issues, and to assist with project delivery in the deployment and operational handover phases.

Key responsibilities: Collaborate with internal and external stakeholders throughout the customer and partner lifecycle. Provide remote support under stringent SLAs to MATRIXX customers and partners in all aspects of the MATRIXX platform. Assist in deployment and operational support activities. Participate in the out-of-hours on-call rotation. Support customer relationships with the goal of establishing and maintaining a professional ‘trusted advisor’ position. Identify, communicate, and drive improvement initiatives for processes and tools. Monitor mobile technology and industry developments to keep your knowledge current. Provide coaching and mentoring to more junior colleagues.

Required skills and experience: Proven experience in a Support or Services role, preferably in a Telco environment. Proven experience with real-time network technology (Intelligent Networks, Online/Converged Charging Systems, Mediation, Policy Control). Evidence of technical expertise and significant experience in the following Telco skills: Unix and Linux; applications and systems (OCS/CCS, IN, Billing, PCRF); networking protocols (DIAMETER, 5G, SS7, MAP, INAP, CAP). A bachelor’s degree or equivalent technical/science-based qualification. Good communication skills covering both technical and non-technical aspects. International working experience.

Preferred Qualifications — additional skills in one or more of the following: IT and web service protocols (HTTP/REST); scripting languages (Python, Perl, Bash); OS and programming (Red Hat Enterprise Linux, Java, C++); monitoring applications and visualization (Prometheus/Grafana/PRTG); monitoring protocols (SNMP, JMX); IP networking (routing/switching, load balancing, resilience concepts); cloud/virtualisation (VMware, Azure, AWS, Google Cloud Platform, Kubernetes/Docker, OpenShift, KVM); Apache HBase, MongoDB, Redpanda, Apache Kafka, Apache ActiveMQ.

Competencies: Resolver - identifies the problem and provides a resolution. Focused - takes direction, follows through, and makes the corrections necessary to stay on track, prioritizing before reacting. Collaborator - works hard with others to achieve the team’s goals and objectives. Mentor - recognizes and cultivates the potential in others, identifies incremental improvements, and derives satisfaction from evidence of progress. Influencer - connects with someone new and wins them over. Achiever - takes satisfaction from being busy and productive, possessing a great deal of stamina and working hard. Adaptability - embraces the pivot; a problem-solver who does what needs to be done. Boldness - creates alternative ways to proceed quickly; able to spot the relevant patterns and issues of any given scenario. Forward Thinker - willing to question the status quo; unafraid to challenge convention, looking for innovative solutions and outcomes that move the needle. Integrity - committed to stable values such as honesty and loyalty, taking psychological ownership of what they say they will do. Process-oriented - treats all people equally, with stable routines and clear rules and procedures that everyone can follow. Results driven - an independent desire to make a big impact, prioritizing projects based on how much influence they will have on the organization or the people around them.
Values Driven - having unchanging core values that define one's purpose. Compensation: The US pay range for this position is $125,000.00 – $149,000.00 per year; however, base pay offered may vary depending on job-related knowledge, skills, and experience. Base pay information is based on the US market only and will vary by global location. Variable pay in the form of a bonus or commission may be provided as part of the compensation package, in addition to a full range of medical and/or other benefits, depending on the location of the position offered.
|
|
Resident Solutions Architect (RSA)
TISTA Science and Technology Corporation |
Remote, United States
|
Overview: TISTA is seeking a highly skilled Resident Solutions Architect (RSA) to provide advanced technical expertise and guidance in support of our Data and Artificial Intelligence (AI) initiatives. The RSA will play a critical role in shaping and executing modern cloud-native architectures, ensuring secure, scalable, and innovative solutions that align with mission and enterprise requirements. TISTA associates enjoy above-industry healthcare benefits, remote working options, paid time off, training/certification opportunities, a Healthcare Savings Account & Flexible Savings Account, paid life insurance, short-term & long-term disability, 401K match, tuition reimbursement, an Employee Assistance Program, paid holidays, military leave, and much more!

Responsibilities: Design and implement end-to-end architectures on Databricks, Spark, and Delta Lake, supporting large-scale ingestion, ELT, and advanced analytics. Architect medallion (Bronze, Silver, Gold) data lakehouse patterns with batch and real-time pipelines. Enable ML/AI capabilities using MLflow, TensorFlow, PyTorch, and other libraries within collaborative workspaces. Define and enforce security standards, including RBAC, SCIM, SAML SSO, and secrets management. Ensure compliance with security requirements while deploying in customer-controlled VPCs. Integrate BI tools (Tableau, Power BI, etc.) and ensure interoperability across ecosystems. Provide technical leadership, documentation, and mentorship to engineering and data science teams. Partner with stakeholders to translate requirements into scalable and secure solutions.

Qualifications: Bachelor’s degree in Computer Science, Engineering, or a related field. 8+ years of experience in solutions architecture, data engineering, or related roles. Strong expertise in Databricks, Apache Spark, and Delta Lake. Proven experience designing secure cloud-native architectures (AWS, Azure, or GCP). Hands-on experience with SQL, Python, R, or Scala in data/analytics contexts.
Deep understanding of ACID transactions, data versioning, and point-in-time querying. Practical experience with AI/ML model development, deployment, or integration. Experience with data governance, security controls, and RBAC using Unity Catalog. Strong communication skills with the ability to engage both technical and business stakeholders.

Preferred Qualifications: Master’s degree in a technical discipline, or a bachelor’s degree with 10+ years of experience. Experience supporting large-scale federal or healthcare data platforms. Familiarity with ML/AI tools (MLflow, TensorFlow, PyTorch, XGBoost). Knowledge of CI/CD pipelines, GitHub integration, and DevOps best practices. Experience with BI and visualization tools (Power BI, Tableau, MicroStrategy, Qlik). Databricks certification (e.g., Databricks Certified Solutions Architect, Databricks Certified Data Engineer). Relevant cloud certifications (AWS Solutions Architect, Azure Solutions Architect, Databricks Certified Architect).

Location: Remote (U.S.-based). Clearance Requirement: Must be a U.S. Citizen or Green Card holder and able to obtain a Public Trust clearance. Employment Type: Full-time/Contract. Pay Range: The pay for this position ranges from $140,730 to $180,500. The actual salary offer will carefully consider a wide range of factors, including your skills, qualifications, experience, and location. Certain positions are also eligible for additional forms of compensation, such as bonuses. TISTA associates are eligible to participate in our comprehensive benefits plan! More information can be found here: https://tistatech.com/working-at-tista/
|