Colab Python Compiler Jobs in USA
2,075 positions found — Page 8
OZ – Databricks Architect/ Senior Data Engineer
Note: Only applications from U.S. citizens or lawful permanent residents (Green Card Holders) will be considered.
We believe work should be innately rewarding and a team-building venture. Working with our teammates and clients should be an enjoyable journey where we can learn, grow as professionals, and achieve amazing results. Our core values revolve around this philosophy. We are relentlessly committed to helping our clients achieve their business goals, leapfrog the competition, and become leaders in their industry. What drives us forward is the culture of creativity combined with a disciplined approach, passion for learning & innovation, and a ‘can-do’ attitude!
What We're Looking For:
We are seeking a highly experienced Databricks professional with deep expertise in data engineering, distributed computing, and cloud-based data platforms. The ideal candidate is both an architect and a hands-on engineer who can design scalable data solutions while actively contributing to development, optimization, and deployment.
This role requires strong technical leadership, a deep understanding of modern data architectures, and the ability to implement best practices in DataOps, performance optimization, and data governance.
Experience with modern AI/GenAI-enabled data platforms and real-time data processing environments is highly desirable.
Position Overview:
The Databricks Senior Data Engineer will play a critical role in designing, implementing, and optimizing enterprise-scale data platforms using the Databricks Lakehouse architecture. This role combines architecture leadership with hands-on engineering, focusing on building scalable, secure, and high-performance data pipelines and platforms. The ideal candidate will establish coding standards, define data architecture frameworks such as the Medallion Architecture, and guide the end-to-end development lifecycle of modern data solutions.
This individual will collaborate with cross-functional stakeholders, including data engineers, BI developers, analysts, and business leaders, to deliver robust data platforms that enable advanced analytics, reporting, and AI-driven decision-making.
Key Responsibilities:
- Architecture & Design: Architect and design scalable, reliable data platforms and complex ETL/ELT and streaming workflows for the Databricks Lakehouse Platform (Delta Lake, Spark).
- Hands-On Development: Write, test, and optimize code in Python, PySpark, and SQL for data ingestion, transformation, and processing.
- DataOps & Automation: Implement CI/CD, monitoring, and automation (e.g., with Azure DevOps, DBX) for data pipelines.
- Stakeholder Collaboration: Work with BI developers, analysts, and business users to define requirements and deliver data-driven solutions.
- Performance Optimization: Tune Delta tables, Spark jobs, and SQL queries for maximum efficiency and scalability.
- GenAI Application Development: Experience building GenAI applications is a strong plus.
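The pipeline work described above typically centers on Delta Lake MERGE (upsert) semantics. As a rough illustration, the sketch below mimics that merge logic in plain Python dictionaries so it runs anywhere; on Databricks this would be a Spark `MERGE INTO` against a Delta table, and all keys and field names here are invented for the example.

```python
# Plain-Python sketch of MERGE (upsert) semantics used with Delta tables.
# On Databricks this is "MERGE INTO target USING updates ON ...".
# Keys and fields below are hypothetical.

target = {"u1": {"name": "Ada", "visits": 3}}          # existing table state
updates = [{"id": "u1", "name": "Ada", "visits": 4},   # matched -> update
           {"id": "u2", "name": "Grace", "visits": 1}] # not matched -> insert

def merge(target, updates, key="id"):
    """WHEN MATCHED THEN UPDATE, WHEN NOT MATCHED THEN INSERT."""
    for row in updates:
        row = dict(row)                 # avoid mutating caller's data
        target[row.pop(key)] = row      # upsert by business key
    return target

merged = merge(target, updates)  # u1 updated in place, u2 newly inserted
```

The same pattern scales down to incremental loads: only changed rows are shipped, and the key-based match decides update versus insert.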
Requirements:
- 8+ years of experience in data engineering, with strong hands-on expertise in Databricks and Apache Spark.
- Proven experience designing and implementing scalable ETL/ELT pipelines in cloud environments.
- Strong programming skills in Python and SQL; experience with PySpark required.
- Hands-on experience with Databricks Lakehouse, Delta Lake, and distributed data processing.
- Experience working with cloud platforms such as Microsoft Azure, AWS, or GCP (Azure preferred).
- Experience with CI/CD pipelines, Git, and DevOps practices for data engineering.
- Strong understanding of data architecture, data modeling, and performance optimization.
- Experience working with cross-functional teams to deliver enterprise data solutions.
- Ability to tackle complex data challenges while ensuring data quality and reliable delivery.
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
- Experience designing enterprise-scale data platforms and modern data architectures.
- Experience with data integration tools such as Azure Data Factory or similar platforms.
- Familiarity with cloud data warehouses such as Databricks, Snowflake, or Microsoft Fabric.
- Experience supporting analytics, reporting, or AI/ML workloads is highly desirable.
- Databricks, Azure, or cloud certifications are preferred.
- Strong problem-solving, communication, and technical leadership skills.
Technical Proficiency in:
- Databricks, Apache Spark, PySpark, Delta Lake
- Python, SQL, Scala (preferred)
- Cloud platforms: Azure (preferred), AWS, or GCP
- Azure Data Factory, Kafka, and modern data integration tools
- Data warehousing: Databricks, Snowflake, or Microsoft Fabric
- DevOps tools: Git, Azure DevOps, CI/CD pipelines
- Data architecture, ETL/ELT design, and performance optimization
What You’re Looking For:
Join a fast-growing organization that thrives on innovation and collaboration. You’ll work alongside talented, motivated colleagues in a global environment, helping clients solve their most critical business challenges. At OZ, your contributions matter – you’ll have the chance to be a key player in our growth and success. If you’re driven, bold, and eager to push boundaries, we invite you to join a company where you can truly make a difference.
About Us:
OZ is a 28-year-old global technology consulting, services, and solutions leader specializing in creating business-focused solutions for our clients by leveraging disruptive digital technologies and innovation.
OZ is committed to creating a continuum between work and life by allowing people to work remotely. We offer competitive compensation and a comprehensive benefits package. You’ll enjoy our work style within an incredible culture. We’ll give you the tools you need to succeed so you can grow and develop with us and become part of a team that lives by its core values.
Hi Rameez here from Beaconfire. I hope you're doing well! We’re currently hiring for an exciting MERN/MEAN Developer role, and I wanted to reach out to see if you or someone in your network might be interested. This is a fantastic opportunity to work on high-impact projects using modern technologies in a collaborative and growth-oriented environment.
About the Company
BeaconFire is based in Central NJ and specializes in Software Development, Web Development, and Business Intelligence. We are looking for candidates with a strong background in Software Engineering or Computer Science for a Python/Node Developer position.
About the Role
The role involves developing websites and writing scalable, secure, maintainable code while collaborating with team members to achieve project goals.
Responsibilities
- Develop websites using HTML, CSS, Node.js, React.js, and Angular2+, among other tools;
- Write scalable, secure, maintainable code that powers our clients’ platforms;
- Create, deploy, and maintain automated system tests;
- Work with Testers to understand opened defects and resolve them in a timely manner;
- Support continuous improvement by investigating alternative technologies and presenting them for architectural review;
- Collaborate effectively with other team members to accomplish shared user story and sprint goals;
- Invest time in continuous professional development to stay up to date with new technological developments and programming languages;
- Discover and fix programming bugs;
- Other duties as assigned.
Qualifications
- Proficient understanding of HTML and CSS;
- Experience in programming language JavaScript or similar (e.g. Java, Python, C, C++, C#, etc.) and understanding of the software development life cycle;
- Basic knowledge of code versioning (e.g. Git, SVN);
- A passion for coding pixel-perfect web pages;
- Good verbal communication and interpersonal skills.
Preferred Skills
- Bachelor's degree or higher in Computer Science or related fields;
- 0-1 year of practical experience in JavaScript coding;
- Familiarity with at least one JavaScript framework (Angular2+, React.js, Express.js);
- Experience with unit and integration testing of code, with an understanding of JavaScript testing frameworks like Jasmine, Cucumber, Mocha, and Karma;
- Experience providing REST/SOAP APIs for user interface consumption;
- Experience working within an Agile development methodology Scrum.
BeaconFire is an E-verified company and provides equal employment opportunities (visa sponsorship provided).
Role description
At Tata Technologies we make product development dreams a reality by designing, engineering, and validating the products of tomorrow for the world’s leading manufacturers. Due to our continued growth, we are now recruiting for the position below.
Role Overview
We are looking for an experienced xIL Onsite Coordinator to lead and coordinate development and integration activities related to xIL platforms, test automation, CI/CD pipelines, and virtualization environments for automotive software validation.
This role will act as the technical interface between the customer and offshore engineering teams, ensuring smooth execution of HIL/SIL automation, DevOps integration, and platform development.
Key Responsibilities
- Coordinate development and integration of Python-based xIL automation libraries and frameworks.
- Support implementation of test automation frameworks (Robot Framework or similar) for automotive testing.
- Manage CI/CD pipelines for automated test execution, build, deployment, and reporting.
- Coordinate integration of HIL/SIL platforms and automotive test environments.
- Develop and maintain automation scripts and workflows using Python and YAML.
- Support development of REST APIs and backend services for platform integration.
- Work with automotive communication protocols such as CAN, UDS, and XCP.
- Act as the onsite technical coordinator between customer teams and offshore engineering teams.
- Troubleshoot platform, automation, and integration issues to ensure smooth project execution.
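Since the responsibilities above involve Python automation around CAN and UDS, a small sketch may help make that concrete. The example below packs and unpacks a classic CAN frame in the Linux SocketCAN `struct can_frame` wire layout (32-bit ID, DLC byte, 3 padding bytes, 8 data bytes); the specific ID and payload are only illustrative, loosely following a UDS DiagnosticSessionControl request on the OBD broadcast ID.

```python
# Minimal sketch: packing/unpacking a classic CAN frame in the SocketCAN
# layout "<IB3x8s" (u32 can_id, u8 DLC, 3 pad bytes, 8 data bytes).
# The ID/payload values are illustrative, not from any specific project.
import struct

CAN_FRAME_FMT = "<IB3x8s"  # 16 bytes total

def pack_frame(can_id, data):
    if len(data) > 8:
        raise ValueError("classic CAN payload is at most 8 bytes")
    # Zero-pad the payload to the fixed 8-byte data field.
    return struct.pack(CAN_FRAME_FMT, can_id, len(data), data.ljust(8, b"\x00"))

def unpack_frame(frame):
    can_id, dlc, data = struct.unpack(CAN_FRAME_FMT, frame)
    return can_id, data[:dlc]  # trim padding back off using the DLC

# Illustrative: a UDS DiagnosticSessionControl request (single ISO-TP frame)
# on the OBD-II functional broadcast ID 0x7DF.
frame = pack_frame(0x7DF, b"\x02\x10\x01")
can_id, payload = unpack_frame(frame)
```

In a real xIL setup this byte-level handling usually sits behind a bus library or the dSPACE toolchain, but the frame layout is the same.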
Required Skills
- Strong Python programming and automation development experience.
- Hands-on experience with CI/CD pipelines (GitHub Actions or similar tools).
- Experience with Robot Framework or other automation frameworks.
- Knowledge of HIL/SIL testing platforms and dSPACE toolchain.
- Experience with REST APIs and backend integrations.
- Knowledge of automotive communication protocols (CAN, UDS, XCP).
- Experience with Git version control.
- Strong communication and stakeholder coordination skills.
Good to Have
- Knowledge of ASAM standards and xIL APIs.
- Experience with automotive calibration tools (INCA, CANape).
- Exposure to cloud platforms (AWS/Azure/GCP).
- Experience with Docker/Kubernetes.
- Experience working in Agile/Scrum environments.
Equal Opportunity Statement:
Tata Technologies Inc. is an Equal Opportunity/ Affirmative Action employer. We provide equal employment opportunities to all qualified employees and applicants for employment without regard to race, religion, sex, age, marital status, national origin, sexual orientation, citizenship status, veteran status, disability, or any other legally protected status. We prohibit discrimination in decisions concerning recruitment, hiring, compensation, benefits, training, termination, promotions, or any other condition of employment or career development.
Tata Technologies: Engineering a better world.
Tata Technologies would like to thank all applicants for their interest, each application will be reviewed against the set criteria for the role. We would like to advise that only candidates under consideration will be contacted. If you do not hear from us within 10 working days following the closing date it will mean that unfortunately your application has not been successful. We will however retain your details for any suitable future opportunities.
Job Description
Role: QA Automation Engineer
Location: Mount Laurel, NJ (Onsite)
We are looking for a highly skilled SDET / QA Automation Engineer with strong experience in Python, JavaScript, and modern automation frameworks to support automation solutions and end-to-end network validation.
Key Skills Required:
Python Automation
JavaScript
SDET / QA Automation
Automation Frameworks (PyTest / Selenium / Playwright / Cypress)
Microservices Testing
API Testing
Networking / Cable Technologies Knowledge
End-to-End System Validation
Responsibilities:
• Develop automation solutions and test scripts for network platforms
• Build and maintain automation frameworks using Python & JavaScript
• Validate end-to-end network components and behavior
• Develop automation microservices for testing infrastructure
• Collaborate with cross-functional teams and clients to ensure quality delivery
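The responsibilities above combine API testing with pytest-style frameworks. As a self-contained sketch, the test below exercises a stubbed service client with plain assertions; in practice the client would wrap real HTTP calls (requests/httpx) against the network platform under test, and the endpoint and field names here are hypothetical.

```python
# A minimal API-test sketch in the pytest style. The client is stubbed so
# the example runs standalone; endpoint/field names are invented.

class FakeStatusClient:
    """Stand-in for an HTTP client hitting a hypothetical /status endpoint."""
    def get_status(self, device_id):
        # A real client would issue GET /status/{device_id} and parse JSON.
        return {"device": device_id, "state": "online", "code": 200}

def test_device_reports_online():
    client = FakeStatusClient()
    resp = client.get_status("cm-0042")
    assert resp["code"] == 200
    assert resp["state"] == "online"
    assert resp["device"] == "cm-0042"

test_device_reports_online()  # pytest would discover and run this automatically
```

Swapping the fake client for a real one keeps the test body unchanged, which is the main point of isolating transport behind a client class.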
Duration: 12 months (possible extension)
Description:
Job Title: Senior AI Fullstack Engineer
The Position
- We advance science so that we all have more time with the people we love. Client’s Early Clinical Development (ECD) department is seeking a talented and motivated Software Developer reporting to the Director of Software Engineering on the Clinical Data Insights & Automation (CDI&A) team.
- The CDI&A team collaborates with a broad range of stakeholders involved in the clinical development process (e.g., Clinical Science, Clinical Operations, Medical Writing, Quality, Regulatory). It develops industry-leading solutions for highly complex business problems. To manage the design and pilot of these software products, they apply a high degree of ingenuity and creativity while keeping a finger on the pulse of the rapidly changing tech and healthcare landscapes.
- The Software Engineer will primarily be responsible for designing, developing, and deploying software which interacts with cutting-edge generative AI models and applications in collaboration with AI scientists, full stack developers, and others. Their work will directly impact our ability to create and deliver innovative solutions that leverage AI to solve complex problems and enhance user experiences.
The Opportunity:
- Innovate and develop software applications to support clinical development
- Identify and integrate AI/LLM capabilities to enhance data processing and natural language workflows.
- Design intuitive, user-centric interfaces.
- Code Quality and Documentation: Write clean, maintainable, and well-documented code. Participate in code reviews and contribute to best practices in software development.
- Research and Innovation: Stay up-to-date with the latest advancements in generative AI and machine learning. Evaluate new technologies and methodologies to continuously improve our solutions.
- Collaborate with Cross-Functional Teams: Work closely with data scientists, engineers, and product managers to integrate generative AI capabilities into our products and services.
- Deployment and Monitoring: Develop and maintain robust deployment pipelines for AI-enhanced applications. Monitor pipeline performance in production and implement necessary improvements.
Who You Are:
- An experienced full stack developer capable of bringing your expertise to our existing and upcoming AI applications/projects as both a leader and individual contributor.
- Someone with a clear understanding of the current landscape of AI & AI-based applications, including potential benefits, limitations, and standard of practice.
Minimum Requirements:
- Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field.
- 5+ years of full stack development experience
- Strong proficiency in both a front-end framework (Vue.js, React, or similar) and a backend web framework in Python and/or JavaScript (Django, FastAPI, Flask, Next.js, or similar)
- 4+ years experience with front-end frameworks (preferably Vue.js)
- 2+ years of developing and deploying AI/ML solutions or applications
- Experience designing and developing RESTful APIs (with e.g. Python FastAPI).
- Familiarity with prompt engineering
- Proficiency with containerized workflows and architectures (Podman, Docker, Kubernetes)
- Strong automated software testing skills (Python unittest, jest, Playwright)
- Familiar with Agile methodologies
- Excellent analytical and problem-solving skills with a track record of tackling complex technical challenges.
- Experience leading system design and implementing scalable, fault-tolerant solutions for complex, distributed computing challenges.
- Strong interpersonal and communication skills, with the ability to articulate complex technical concepts to non-technical stakeholders.
- Experience with cloud platforms (e.g. AWS) and modern data platforms (e.g., Snowflake).
- Experience implementing chatbots, retrieval-augmented generation (RAG) systems, and integrating LLMs into applications (AI-assisted automation)
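Since the requirements above call out retrieval-augmented generation (RAG), a toy sketch of the core idea may be useful: retrieve the document most relevant to a query, then splice it into a prompt for the model. Real systems use embedding search and an LLM call; both are stubbed here with keyword overlap and a template string, and the corpus content is invented.

```python
# Toy RAG sketch: keyword-overlap retrieval + prompt templating.
# Production systems replace retrieve() with vector search and feed the
# prompt to an LLM; the document texts below are made up.

DOCS = {
    "dosing": "Dose escalation proceeds in 3-subject cohorts.",
    "safety": "Adverse events are graded per CTCAE v5.",
}

def retrieve(query):
    """Return the document sharing the most words with the query."""
    words = set(query.lower().split())
    return max(DOCS.values(), key=lambda d: len(words & set(d.lower().split())))

def build_prompt(query):
    """Ground the model by injecting retrieved context into the prompt."""
    context = retrieve(query)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

prompt = build_prompt("How are adverse events graded?")
```

The grounding step is what distinguishes RAG from plain prompting: the model answers from retrieved context rather than from its weights alone.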
Preferred Qualifications:
- Experience building AI agents, fine-tuning LLM models, and evaluating bias and fairness with LLM systems
- Experience in developing Microsoft Word add-ins using Office.js.
- Experience with web technologies like JWT, WebSockets, etc.
- Experience with Huggingface, Langchain, TensorFlow, PyTorch, or similar.
- Familiarity with DevOps, infrastructure, and continuous integration concepts.
- Familiarity with CRDT technologies like Yjs.
- Experience with using NLP/LLMs on clinical text.
- Basic knowledge of clinical drug development
About US Tech Solutions:
US Tech Solutions is a global staff augmentation firm providing a wide range of talent on-demand and total workforce solutions. To know more about US Tech Solutions, please visit our website. US Tech Solutions is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Recruiter Details:
Name: Prateek Chaturvedi
Email:
Internal Id: 26-06298
Job Title: Senior Data Engineer
Location: Chicago, IL (Hybrid)
Department: Data & Analytics
Reports To: Head of Data Engineering / Data Platform Lead
Role Overview
We are seeking a highly skilled Senior Data Engineer with strong Python development expertise and deep experience in Snowflake to design, build, and optimize scalable enterprise data solutions. This role is based in Chicago, IL and will support regulatory and risk data initiatives in a highly governed environment.
The ideal candidate has hands-on experience building modern cloud data platforms and is familiar with risk management frameworks, BCBS 239 principles, and Governance, Risk & Compliance (GRC) requirements within financial services.
Key Responsibilities
Data Engineering & Architecture
Design, develop, and maintain scalable data pipelines using Python.
Build and optimize data models, transformations, and data marts within Snowflake.
Develop robust ELT/ETL frameworks for structured and semi-structured data.
Optimize Snowflake performance, cost efficiency, clustering, and workload management.
Implement automation, monitoring, and CI/CD for data pipelines.
Risk & Regulatory Data Management
Support regulatory reporting aligned with BCBS 239 (risk data aggregation and reporting).
Ensure data traceability, lineage, reconciliation, and auditability.
Implement controls aligned with Governance, Risk & Compliance (GRC) frameworks.
Partner with Risk, Finance, Compliance, and Audit teams to deliver accurate and governed data assets.
Data Governance & Quality
Develop and enforce data quality validation frameworks.
Maintain metadata, lineage documentation, and data catalog integration.
Implement data access controls and security best practices.
Technical Leadership
Provide mentorship and code reviews for data engineering team members.
Promote engineering best practices and documentation standards.
Collaborate cross-functionally with architects, analysts, and business stakeholders.
Required Qualifications
7+ years of experience in Data Engineering or Data Platform development.
Strong Python programming expertise (Pandas, PySpark, Airflow, etc.).
Hands-on experience with Snowflake (data modeling, Snowpipe, Streams & Tasks, performance tuning).
Advanced SQL skills and deep understanding of data warehousing concepts.
Experience supporting BCBS 239 compliance or similar regulatory reporting frameworks.
Experience working within Governance, Risk & Compliance (GRC) structures.
Experience in cloud environments (AWS, Azure, or GCP).
Strong understanding of data lineage, controls, reconciliation, and audit requirements.
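The qualifications above emphasize advanced SQL (window functions, CTEs). As a small, runnable illustration, the query below uses the common CTE + `ROW_NUMBER() OVER (PARTITION BY ...)` idiom to keep the latest row per key, executed against SQLite (bundled with Python) rather than Snowflake; the table and column names are invented for the example.

```python
# CTE + window-function sketch: latest record per key, the standard
# dedup pattern in Snowflake, shown here on SQLite. Schema is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE trades (desk TEXT, ts INTEGER, notional REAL);
    INSERT INTO trades VALUES
        ('rates', 1, 100.0), ('rates', 2, 250.0), ('fx', 1, 75.0);
""")

rows = conn.execute("""
    WITH ranked AS (
        SELECT desk, ts, notional,
               ROW_NUMBER() OVER (PARTITION BY desk ORDER BY ts DESC) AS rn
        FROM trades
    )
    SELECT desk, notional FROM ranked WHERE rn = 1   -- newest row per desk
    ORDER BY desk
""").fetchall()
```

The same shape appears constantly in regulated pipelines: partition by business key, order by event time, keep `rn = 1`, and the lineage of which row "won" stays auditable in SQL.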
Preferred Qualifications
Experience in banking, capital markets, or financial services.
Knowledge of credit risk, market risk, liquidity risk, or regulatory reporting domains.
Experience with data governance tools (Collibra, Alation, etc.).
Familiarity with DevOps practices, Docker, Kubernetes.
Experience building enterprise data platforms in highly regulated environments.
Key Competencies
Strong problem-solving and analytical thinking.
Ability to operate in a regulated, audit-driven environment.
Excellent communication and stakeholder management skills.
Detail-oriented with a focus on data accuracy and integrity.
Leadership mindset with hands-on technical capability.
Position: Lead JavaScript Engineer/Architect
Location: Charlotte, NC
Duration: 6 Months - CTH
Key skills: JavaScript, TypeScript, Node.js; Python (at least basic); ability to design a trading system; performance tuning.
Job Description:
We are looking for an Architect/Lead Software Engineer to join our team. You will work with our product, design, and engineering teams to plan, design, and develop customer facing applications for credit cards. We offer an opportunity to work in a collaborative and inclusive environment with people who value their work and who welcome fresh ideas.
Key Responsibilities:
- Perform complex application programming activities with an emphasis on backend systems development: Node.JS, TypeScript, JavaScript, Python, RESTful APIs, Data Pipelines and more
- Lead the definition of system architecture and detailed solution design that are scalable and extensible
- Collaborate with Product Owners, Designers, and other engineers on different permutations to find the best solution possible
- Own the quality of code and do your own testing. Automate feature testing and contribute to the UI testing framework
- Become a subject matter expert for our mobile applications backend and middleware
- Deliver amazing solutions to production that knock everyone’s socks off
- Mentor developers on the team
- Aid technical team as needed
- Assist in interviewing and building out technical team
- Suggest improvements to optimize delivery
Basic Qualifications:
- Minimum B.S. / M.S. Computer Science or related discipline from accredited college or University
- At least 8 years of experience designing, developing, and delivering backend applications with Node.JS, TypeScript, JavaScript, Python, RESTful APIs, data pipelines, and related backend frameworks
- At least 8 years of experience building internet facing applications
- At least 8 years of experience with a major cloud platform and/or OpenShift, preferably AWS
- Proficient in following concepts: object-oriented programming, software engineering techniques, quality engineering, parallel programming, databases, etc.
- Proficient in building and consuming RESTful APIs
- Proficient in managing multiple tasks and consistently meeting established timelines
- Experience integrating APIs with front-end and/or mobile-specific frameworks
- Strong collaboration skills
- Excellent written and verbal communication skills
We Are Hiring: Databricks Lead Data Engineer – Director Equivalent Role
Location: Atlanta, USA
Work Model: Hybrid – 3 to 4 days in office per week (mandatory)
Eligibility: US Citizens and Green Card (GC) holders only
How to Apply
If you are interested in this position and have the required skills, please send us your resume.
Paves Technologies is seeking a highly experienced Databricks Lead Data Engineer – Lead Level (Director Equivalent Role) to drive enterprise-scale data architecture, governance, and advanced analytics initiatives on Azure Cloud. This is a senior leadership role requiring deep Databricks expertise, strong data modeling capabilities, and hands-on architectural ownership across PySpark-based distributed systems.
Role Overview
The ideal candidate will bring 10-12 + years of overall data engineering experience, including strong hands-on expertise with Azure Databricks, PySpark, Python, and Azure Cloud data services. You will define architecture standards, lead modernization initiatives, and implement scalable Medallion Architecture (Bronze, Silver, Gold layers) to support enterprise analytics and business intelligence.
Key Responsibilities
- Lead end-to-end architecture and implementation of enterprise-scale data platforms using Azure Databricks on Azure Cloud.
- Design and implement Medallion Architecture (Bronze, Silver, Gold layers) using Delta Lake best practices.
- Build scalable PySpark-based ETL/ELT pipelines across ingestion (Bronze), transformation (Silver), and curated analytics (Gold) layers.
- Develop advanced data transformations using Python, PySpark, Spark SQL, and advanced SQL constructs.
- Architect robust data models (dimensional, star schema, normalized models) aligned to analytics and reporting needs.
- Drive adoption of advanced Databricks capabilities including Unity Catalog, Declarative Pipelines, Delta Lake optimization, and governance frameworks.
- Establish best practices for partitioning strategies, file compaction, Z-ordering, caching, broadcast joins, and query optimization.
- Define and standardize reusable Azure Cloud data platform tools, templates, CI/CD frameworks, and infrastructure automation.
- Work across Azure ecosystem components such as Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), Azure DevOps, networking, and security services.
- Ensure high standards for data quality, RBAC, lineage tracking, governance, and production stability.
- Provide architectural leadership and mentorship to data engineering teams.
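The Medallion Architecture responsibilities above can be sketched end to end. The toy flow below walks invented records through Bronze (raw as landed), Silver (validated and typed, with rejects quarantined), and Gold (a curated aggregate); on Databricks each layer would be a Delta table written by PySpark, with plain Python standing in here so the example runs anywhere.

```python
# Illustrative Bronze -> Silver -> Gold flow in plain Python. On Databricks
# each layer is a Delta table populated by PySpark jobs; all field names
# and values here are hypothetical.

bronze = [  # Bronze: raw ingested events, kept exactly as landed
    {"order_id": "A1", "amount": "19.99", "country": "US"},
    {"order_id": "A2", "amount": "oops",  "country": "US"},  # bad amount
    {"order_id": "A3", "amount": "5.00",  "country": "DE"},
]

def to_silver(rows):
    """Silver: validate and type-cast; quarantine bad rows, don't drop them silently."""
    clean, rejects = [], []
    for r in rows:
        try:
            clean.append({**r, "amount": float(r["amount"])})
        except ValueError:
            rejects.append(r)
    return clean, rejects

def to_gold(rows):
    """Gold: curated, analytics-ready aggregate (revenue per country)."""
    out = {}
    for r in rows:
        out[r["country"]] = out.get(r["country"], 0.0) + r["amount"]
    return out

silver, rejected = to_silver(bronze)
gold = to_gold(silver)
```

Keeping the quarantined rows alongside Silver is what makes the data-quality and lineage guarantees mentioned above auditable rather than implicit.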
Required Experience & Skills
- 10–12+ years of overall experience in Data Engineering.
- Minimum 3+ years of strong hands-on Databricks experience.
- Mandatory Certifications:
- Databricks Certified Data Engineer Associate
- Databricks Certified Data Engineer Professional
- Deep hands-on expertise in PySpark, Python programming, and distributed Spark processing.
- Strong experience designing and implementing Medallion Architecture (Bronze/Silver/Gold layers).
- Advanced knowledge of Data Modeling, Data Analysis, and complex SQL (window functions, CTEs, execution plan tuning).
- Strong understanding of Delta Lake architecture, schema evolution, partition strategies, performance optimization, and data governance.
- Well-versed in enterprise Azure Cloud data platforms, reusable accelerators, CI/CD templates, and governance standards.
- Proven experience architecting scalable, secure, cloud-native data solutions.
- Strong leadership, stakeholder management, and executive communication skills.
Akkodis is seeking a Platform Software Engineer for a full-time role with a client located in Allentown, Cincinnati, or Chicago (occasionally onsite). Ideal applicants will have a solid background in embedded systems/payroll, Python/Django, AWS, and APIs; React experience would come as a big plus.
Salary Range: $165k-$190k/year + benefits. The salary may be negotiable based on experience, education, geographic location, and other factors.
Key Responsibilities
Customer Collaboration & Developer Relations
- Serve as the primary technical interface for customers, partners, and indirect channel teams.
- Host technical discussions, requirement-gathering sessions, and architecture reviews directly with customers.
- Provide developer support, guidance, and best practices for integration and implementation.
- Represent the embedded team in developer-facing demos, product enablement, and partner technical workshops.
- Translate customer needs into actionable engineering tasks, ensuring platform capabilities align with business goals.
Platform Customization & Embedded Engineering
- Lead the design and implementation of custom features, enhancements, and integrations across our platform.
- Develop application-layer components and service extensions using Python, Django, and REST-based APIs.
- Build and integrate customer-facing portal/UI components in React.
- Leverage AWS services to build scalable, cloud-backed integrations.
Architecture, API & Systems Integration
- Own the documentation and development of APIs, SDKs, and interface design documents (IDDs).
- Ensure seamless integration between on-device components, microservices, and customer systems.
Technical Troubleshooting & Tier-3 Support
- Diagnose and resolve complex issues.
- Provide senior-level technical escalation support for customers and internal teams.
- Lead root-cause analysis and drive long-term corrective improvements across the platform.
Project Delivery & Leadership
- Lead customer-specific development initiatives from ideation through deployment.
- Own timelines, deliverables, and technical direction for customization projects.
- Mentor junior engineers and influence cross-team engineering best practices.
Required Skills & Qualifications
- Expertise in Python, with experience building backend services or integration layers (Django strongly preferred).
- Proficiency in React for developing customer-facing interfaces or partner tools.
- Experience with AWS (Lambda, API Gateway, S3, CloudWatch, IAM, or related services).
- Strong experience with embedded systems development.
- Experience with API design, interface documentation, and integration workflows.
- Excellent communication skills with the ability to convey complex technical concepts to diverse audiences, including customers and indirect channel partners.
- Proven success in customer-facing engineering, developer support, or solutions engineering roles.
- Bachelor’s degree in Computer Engineering, Electrical Engineering, Computer Science, or related field.
Preferred Qualifications
- Experience in Developer Relations, technical evangelism, or technical account management.
- Experience supporting indirect channel partners, VARs, ISVs, or system integrators.
- Background in pre-sales engineering, partner enablement, or customer-led solution design.
If you are interested in this role, then please click APPLY NOW. For other opportunities available at Akkodis, or any questions, feel free to contact me at
Equal Opportunity Employer/Veterans/Disabled
Benefits offerings include but are not limited to:
• 401(k) with match
• Medical insurance
• Dental Insurance
• Vision assistance
• Paid Holidays Off
To read our Candidate Privacy Information Statement, which explains how we will use your information, please visit our website. The Company will consider qualified applicants with arrest and conviction records in accordance with federal, state, and local laws and/or security clearance requirements, including, as applicable:
· The California Fair Chance Act
· Los Angeles City Fair Chance Ordinance
· Los Angeles County Fair Chance Ordinance for Employers
· San Francisco Fair Chance Ordinance
This is a multi-year contract with our direct client.
No third parties!
The Early Clinical Development (ECD) department with our South San Francisco, CA client is seeking a talented and motivated Senior AI Full Stack Engineer reporting to the Director of Software Engineering on the Clinical Data Insights & Automation (CDI&A) team. The CDI&A team collaborates with a broad range of stakeholders involved in the clinical development process (e.g., Clinical Science, Clinical Operations, Medical Writing, Quality, Regulatory). It develops industry-leading solutions for highly complex business problems. To manage the design and pilot of these software products, they apply a high degree of ingenuity and creativity while keeping a finger on the pulse of the rapidly changing tech and healthcare landscapes.
The Software Engineer will primarily be responsible for designing, developing, and deploying software which interacts with cutting-edge generative AI models and applications in collaboration with AI scientists, full stack developers, and others. Their work will directly impact our ability to create and deliver innovative solutions that leverage AI to solve complex problems and enhance user experiences.
The Opportunity:
- Innovate and develop software applications to support clinical development
- Identify and integrate AI/LLM capabilities to enhance data processing and natural-language workflows.
- Design intuitive, user-centric interfaces.
- Code Quality and Documentation: Write clean, maintainable, and well-documented code. Participate in code reviews and contribute to best practices in software development.
- Research and Innovation: Stay up-to-date with the latest advancements in generative AI and machine learning. Evaluate new technologies and methodologies to continuously improve our solutions.
- Collaborate with Cross-Functional Teams: Work closely with data scientists, engineers, and product managers to integrate generative AI capabilities into our products and services.
- Deployment and Monitoring: Develop and maintain robust deployment pipelines for AI-enhanced applications. Monitor pipeline performance in production and implement necessary improvements.
Who You Are:
- An experienced full stack developer capable of bringing your expertise to our existing and upcoming AI applications/projects as both a leader and individual contributor.
- Someone with a clear understanding of the current landscape of AI & AI-based applications, including potential benefits, limitations, and standard of practice.
Minimum Requirements:
- Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field.
- 5+ years of full stack development experience
- Strong proficiency in a front-end framework (js, React, or similar) and a back-end web framework in Python and/or JavaScript (Django, FastAPI, Flask, Next.js, or similar)
- 4+ years of experience with front-end frameworks (preferably js)
- 2+ years of developing and deploying AI/ML solutions or applications
- Experience designing and developing RESTful APIs (e.g., with Python FastAPI).
- Familiarity with prompt engineering
- Proficiency with containerized workflows and architectures (Podman, Docker, Kubernetes)
- Strong automated software testing skills (Python unittest, Jest, Playwright)
- Familiar with Agile methodologies
- Excellent analytical and problem-solving skills with a track record of tackling complex technical challenges.
- Experience leading system design and implementing scalable, fault-tolerant solutions for complex, distributed computing challenges.
- Strong interpersonal and communication skills, with the ability to articulate complex technical concepts to non-technical stakeholders.
- Experience with cloud platforms (e.g. AWS) and modern data platforms (e.g., Snowflake).
- Experience implementing chatbots, retrieval-augmented generation (RAG) systems, and integrating LLMs into applications (AI-assisted automation)
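The requirements above mention retrieval-augmented generation (RAG) and prompt engineering; the sketch below illustrates the basic retrieve-then-prompt pattern those terms refer to. It is a toy example, not part of the role: the keyword-overlap retriever, the `corpus` strings, and all function names are invented for illustration, and a production system would use embedding-based search and pass the prompt to a real LLM.

```python
# Toy retrieval-augmented generation (RAG) sketch.
# Retrieval is a naive keyword-overlap scorer; real systems use
# embedding search, and the prompt would be sent to an actual LLM.

def score(query: str, doc: str) -> int:
    """Count how many words the query and document share."""
    query_words = set(query.lower().split())
    return sum(1 for w in set(doc.lower().split()) if w in query_words)

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k documents with the highest keyword overlap."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Prompt engineering step: ground the answer in retrieved context."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {query}"
    )

corpus = [
    "Protocol P-101 enrolls patients with early-stage disease.",
    "Adverse events must be reported within 24 hours.",
]
prompt = build_prompt("How fast must adverse events be reported?", corpus)
```

The same two-step shape (retrieve relevant context, then template it into the prompt) underlies the chatbot and AI-assisted-automation work described in this role, with the retriever and generator swapped for production components.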
Preferred Qualifications:
- Experience building AI agents, fine-tuning LLMs, and evaluating bias and fairness in LLM systems
- Experience in developing Microsoft Word add-ins using js.
- Experience with web technologies like JWT, WebSockets, etc.
- Experience with Huggingface, Langchain, TensorFlow, PyTorch, or similar.
- Familiarity with DevOps, infrastructure, and continuous integration concepts.
- Familiarity with CRDT technologies like Yjs.
- Experience with using NLP/LLMs on clinical text.
- Basic knowledge of clinical drug development