Duration: 10+ Months
Location: Remote
Overview
We are seeking an experienced Solution Architect to lead the enterprise rollout of Microsoft Purview across a complex, global, multi-cloud environment. The consultant will define the architecture, implement domain-based governance, and drive adoption of Purview capabilities including cataloging, lineage, classification, access governance, and compliance controls.
Key Responsibilities
- Architecture & Implementation
- Define target-state architecture for Microsoft Purview across Azure, AWS, M365, on-premises, and third-party platforms.
- Develop and drive the implementation roadmap across U.S. Businesses, PGIM, Corporate Technology, and international units.
- Establish Purview reference architecture, integration patterns, and guardrails.
- Domain-Based Governance
- Design collections, hierarchies, and RBAC aligned to domain structures and legal entity boundaries.
- Enable domain-owned stewardship while enforcing enterprise taxonomies and governance standards.
- Platform Configuration
- Configure Data Map, Catalog, Scans, Classifications, Sensitivity Labels, and Lineage.
- Optimize scan strategy (frequency, cost, performance) and extend classifiers and metadata models.
- Security & Compliance
- Integrate Purview with M365 Information Protection, Entra ID, and security baselines.
- Support PII/PCI/PHI detection, access governance, and regulatory compliance (SOX, GLBA, NYDFS, GDPR).
- Engineering & Integration
- Integrate with Synapse, Fabric, Databricks (including Unity Catalog), Snowflake, SQL Server, AWS sources, and SAP/Oracle.
- Implement IaC (Bicep/Terraform), CI/CD for Purview artifacts, and automation via APIs.
- Adoption & Stakeholder Management
- Deliver training, onboarding playbooks, and steward enablement.
- Lead workshops for new data domains and products.
- Provide executive level reporting on progress, risks, and KPIs.
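The automation-via-APIs responsibility above can be illustrated with a short sketch. The account name, data source, scan name, and `api-version` below are assumptions for illustration, not details from this engagement; a real caller would also attach an Entra ID bearer token:

```python
import uuid

# Hypothetical values -- replace with a real Purview account, registered
# data source, and scan name.
PURVIEW_ACCOUNT = "contoso-purview"
API_VERSION = "2022-02-01-preview"  # assumed scanning-plane API version

def scan_run_url(account: str, datasource: str, scan: str, run_id: str) -> str:
    """Build the scanning-plane URL used to trigger a Purview scan run."""
    return (
        f"https://{account}.purview.azure.com/scan/datasources/{datasource}"
        f"/scans/{scan}/runs/{run_id}?api-version={API_VERSION}"
    )

run_id = str(uuid.uuid4())
url = scan_run_url(PURVIEW_ACCOUNT, "AzureDataLakeGen2", "WeeklyFullScan", run_id)
# A real caller would PUT this URL with an Entra ID bearer token, e.g.:
#   requests.put(url, headers={"Authorization": f"Bearer {token}"})
print(url)
```

The same pattern extends to the CI/CD bullet: the Bicep or Terraform layer provisions the account, and pipeline steps call the REST surface to register sources and schedule scans.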
Required Qualifications
- 10+ years in data architecture/governance; 2+ years hands-on Purview experience at enterprise scale.
- Strong expertise in metadata management, lineage, classification, scan optimization, glossary management, and domain-based operating models.
- Solid Azure ecosystem knowledge (Storage, Key Vault, Synapse, Fabric, Databricks), M365 Information Protection, and Entra ID.
- Experience with IaC (Bicep/Terraform), APIs/Atlas, and scripting (PowerShell/Python).
- Financial services or regulated industry exposure.
- Excellent communication, stakeholder leadership, and cross-domain facilitation skills.
Project Manager – Data Center Construction
Top‑ranked ENR Electrical Contractor
Multiple U.S. Locations:
Columbus, Ohio - Cedar Rapids, Iowa - Port Washington, WI - Santa Teresa, NM - Sparks, NV.
Data Center Construction
For decades, our client has delivered some of the most complex, high-profile electrical projects in the United States. Their Data Center Division builds mission-critical facilities nationwide, installing and commissioning electrical systems that power millions of users every day. Their culture is defined by integrity, safety, innovation, and doing the right thing even when no one is watching.
Salary Range: $120,000–$160,000 base
Bonus: Eligible for annual performance bonus
(Compensation varies by market and experience, supported by a robust benefits program.)
The Position: Project Manager - Data Center
The successful applicant will assume responsibility for:
- Leading electrical data center construction projects from qualification through estimating, pre‑con, mobilization, execution, and closeout.
- Ensuring all project and contractual requirements are fulfilled safely, professionally, and within budget and schedule.
- Developing the project execution plan, and managing scheduling and coordination with the Superintendent and field teams.
- Identifying risks, analyzing issues, and implementing recovery action plans to protect schedule and budget.
- Managing full project financials: invoice approval, job cost tracking, productivity monitoring, variance reporting, and financial performance reviews.
- Negotiating change orders effectively to maintain scope clarity and profitability.
- Serving as the primary point of contact for customers, end‑user owners, and subcontractors; developing and maintaining strong external relationships.
- Sourcing and qualifying new project opportunities.
- Building and mentoring a high‑performance project team, ensuring employee development and long‑term growth.
- Exercising sound judgment within established policies; decisions directly influence schedules, outcomes, and project success.
The Candidate
Applicants will be expected to demonstrate:
- 5+ years of experience in electrical construction or related fields, including leadership of multi‑million‑dollar projects.
- 5+ years managing people and electrical construction projects across commercial, utility, alternative energy, or data center markets.
- Strong knowledge of estimating, cost accounting, scheduling, procurement, productivity tracking, and reporting.
- Experience working with union labor, labor rate structures, and collective bargaining agreements.
- A proven ability to lead teams, manage risk, influence stakeholders, and drive projects to successful completion.
- Education: High School Diploma or GED required; Bachelor’s in Construction Management, Engineering, Business, or similar preferred.
- Valid driver’s license and satisfactory driving record.
- Must be authorized to work in the U.S. (no sponsorship available).
The Company
Our client is a top‑ranked ENR electrical contractor known for:
- Delivering complex data center builds with excellence, safety, and integrity.
- Building millions of sq. ft. of mission-critical data center space valued at over $2 billion in electrical systems.
- A culture rooted in people-focused leadership, innovation, and long-term career development.
This is a place where employees have real impact, contribute to meaningful work, and grow their careers on challenging, high‑visibility projects.
Compensation & Benefits
- Competitive base salary: $120,000 - $160,000
- Annual performance bonus eligible
- Market‑adjusted compensation aligned with experience and credentials
- Comprehensive, robust benefits program (medical, dental, vision, etc.)
- Strong emphasis on employee development and advancement
Submit your resume along with a detailed project list showcasing your experience in Data Center Construction.
If you have questions, please get in touch via either:
Email:
Cell: (917) 746-4831
Location: Dallas, TX or McLean, VA
Cliff W2
In-person interview
Onsite
- 5+ years in data science, analytics, or cloud financial operations
- Expertise in Python, SQL, and data science libraries (e.g., pandas, scikit-learn)
- Strong statistical modeling and machine learning skills
- Deep understanding of Azure and AWS cost structures and optimization levers
- Excellent communication and stakeholder engagement skills
- Experience with BI tools (Power BI, Tableau)
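As a flavor of the statistical-modeling and cloud-cost work listed above, here is a minimal anomaly-flagging sketch; the spend figures and the 2-sigma threshold are invented for illustration:

```python
from statistics import mean, stdev

# Toy daily cloud spend in USD -- illustrative numbers only.
daily_cost = [120.0, 118.5, 121.2, 119.8, 122.4, 310.0, 120.9]

def flag_cost_anomalies(costs, z_threshold=2.0):
    """Return indices of days whose spend deviates from the mean
    by more than z_threshold standard deviations."""
    mu, sigma = mean(costs), stdev(costs)
    if sigma == 0:
        return []  # flat spend: nothing to flag
    return [i for i, c in enumerate(costs) if abs(c - mu) / sigma > z_threshold]

print(flag_cost_anomalies(daily_cost))  # index 5 is the $310 spike
```

In practice this would run over billing exports (Azure Cost Management or AWS Cost and Usage Reports) with per-service grouping, and a robust estimator (median/MAD) resists the masking that a single large spike causes in the plain z-score.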
Your role and responsibilities
About the Opportunity
IBM Consulting is seeking an accomplished Data & Analytics Associate Partner to accelerate our growth within the Industrial & Communications sectors. This executive role is responsible for shaping client vision, cultivating senior executive relationships, and developing data-driven solutions that enable clients to successfully navigate complex transformation programs.
You will bring together deep industry expertise and IBM’s portfolio of data, analytics, and AI capabilities to help organizations modernize their data ecosystems—migrating from legacy platforms to modern hybrid cloud architectures—while adopting next-generation analytics, GenAI, and agentic AI to strengthen decision-making and deliver measurable business and financial outcomes.
This role is ideal for a seasoned leader who integrates industry depth, consulting excellence, and technical thought leadership, has a strong understanding of competitive market dynamics, and consistently delivers high-impact transformation at scale.
Key Responsibilities
Market Leadership & Growth
Expand IBM’s Data & Analytics presence by identifying new market opportunities, developing differentiated solutions, and building a strong pipeline.
Engage senior client executives to understand strategic priorities and shape data transformation roadmaps aligned to their business and financial goals.
Lead end-to-end sales cycles, including solution definition, proposal leadership, financial structuring, and contract negotiation.
Strategic Advisory & Transformation Delivery
Advise C-suite leaders on strategies for data estate modernization, advanced analytics, GenAI, and agentic AI to drive business performance.
Architect integrated solutions that include:
Migration from legacy data platforms to modern cloud-based architectures
Data engineering and Information governance
Business intelligence and advanced analytics
GenAI-powered and agentic AI-driven automation and decisioning
Lead complex transformation programs from discovery through delivery, ensuring measurable outcomes and client satisfaction.
Engagement Excellence & Financial Stewardship
Oversee multi-disciplinary delivery teams to ensure high-quality, consistent execution across all program phases.
Manage engagement financials, including forecasting, margin performance, and overall portfolio profitability.
Align the right client technologies, industry expertise, and global delivery capabilities to maximize client value.
Practice Building & Talent Development
Recruit, mentor, and grow top-tier consultants, architects, and data specialists.
Build and scale capabilities in data modernization, cloud data engineering, analytics, GenAI, and emerging agentic AI techniques.
Contribute to practice strategy, offering development, and capability growth across the global Data & Analytics team.
Thought Leadership & Market Presence
Stay ahead of sector and technology trends, including cloud modernization, GenAI, agentic system design, regulatory changes, and evolving competitive dynamics.
Represent IBM at industry conferences, client events, webinars, and executive roundtables.
Create original thought leadership (articles, perspectives, points of view) that positions IBM as a leading advisor in data and AI-driven transformation.
This position can be performed anywhere in the US.
"Leaders are expected to spend time with their teams and clients and therefore are generally expected to be in the workplace a minimum of three days a week, subject to business needs."
Required technical and professional expertise
Qualifications
12+ years of experience in consulting, data strategy, analytics, or digital transformation, with strong exposure to the Industrial or Communications sectors.
Hands-on experience modernizing data ecosystems, including migrating from legacy on-premises platforms to modern cloud-native or hybrid cloud architectures.
Deep expertise with major cloud platforms and their data/analytics stacks, including implementation experience with:
AWS (e.g., Redshift, S3, Glue, EMR, Athena, Lake Formation, Bedrock, SageMaker)
Microsoft Azure (e.g., Azure Data Lake, Synapse, Data Factory, Databricks on Azure, Fabric, Cognitive Services)
Google Cloud Platform (e.g., BigQuery, Cloud Storage, Dataflow, Dataproc, Vertex AI)
Experience designing and implementing end-to-end data pipelines, governance frameworks, and analytics solutions on one or more of these platforms.
Strong understanding of GenAI architectures, LLM integration patterns, vector databases, retrieval-augmented generation (RAG), and emerging agentic AI frameworks.
Proven track record of selling, structuring, and delivering large-scale data and AI transformation programs.
Robust technical and functional expertise in data engineering, cloud data platforms, analytics, AI/ML, information management, and governance.
Executive-level communication and presence, with demonstrated ability to influence senior stakeholders and convey complex topics through compelling narratives.
Financial management experience, including engagement economics, forecasting, margin optimization, and portfolio profitability.
Demonstrated leadership in building, scaling, and developing high-performing consulting and technical teams.
Preferred technical and professional experience
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.
About Wakefern
Wakefern Food Corp. is the largest retailer-owned cooperative in the United States and supports its co-operative members' retail operations, trading under the ShopRite®, Price Rite®, The Fresh Grocer®, Dearborn Markets®, and Gourmet Garage® banners.
Employing an innovative approach to wholesale business services, Wakefern focuses on helping the independent retailer compete in a big business world. Providing the tools entrepreneurs need to stay a step ahead of the competition, Wakefern’s co-operative members benefit from the company’s extensive portfolio of services, including innovative technology, private label development, and best-in-class procurement practices.
The ideal candidate will have a strong background in designing, developing, and implementing complex projects, with a focus on automating data processes and driving efficiency within the organization. This role requires close collaboration with application developers, data engineers, data analysts, and data scientists to ensure seamless data integration and automation across various platforms. The Data Integration & AI Engineer is responsible for identifying opportunities to automate repetitive data processes, reduce manual intervention, and improve overall data accessibility.
Essential Functions
- Participate in the development life cycle (requirements definition, project approval, design, development, and implementation) and maintenance of the systems.
- Implement and enforce data quality and governance standards to ensure accuracy and consistency.
- Provide input for project plans and timelines to align with business objectives.
- Monitor project progress, identify risks, and implement mitigation strategies.
- Work with cross-functional teams and ensure effective communication and collaboration.
- Provide regular updates to the management team.
- Follow the standards and procedures according to Architecture Review Board best practices, revising standards and procedures as requirements change and technological advancements are incorporated into the technology infrastructure.
- Communicate and promote the code of ethics and business conduct.
- Ensure completion of required company compliance training programs.
- Be trained, through formal education or experience, in software/hardware technologies and development methodologies.
- Stay current through personal development and professional and industry organizations.
Responsibilities
- Design, build, and maintain automated data pipelines and ETL processes to ensure scalability, efficiency, and reliability across data operations.
- Develop and implement robust data integration solutions to streamline data flow between diverse systems and databases.
- Continuously optimize data workflows and automation processes to enhance performance, scalability, and maintainability.
- Design and develop end-to-end data solutions utilizing modern technologies, including scripting languages, databases, APIs, and cloud platforms.
- Ensure data solutions and data sources meet quality, security, and compliance standards.
- Monitor and troubleshoot automation workflows, proactively identifying and resolving issues to minimize downtime.
- Provide technical training, documentation, and ongoing support to end users of data automation systems.
- Prepare and maintain comprehensive technical documentation, including solution designs, specifications, and operational procedures.
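A toy end-to-end sketch of the pipeline responsibilities above. The feed format, table name, and quarantine policy are invented; production pipelines would target BigQuery or another warehouse rather than in-memory SQLite:

```python
import csv
import io
import sqlite3

# Invented sample feed: one malformed price that validation must catch.
RAW_FEED = "sku,price\nA100,2.49\nB200,not_a_number\nC300,5.10\n"

def run_etl(raw: str, conn: sqlite3.Connection) -> int:
    """Extract rows from the feed, validate/transform them, load the
    good rows, and return the count loaded."""
    conn.execute("CREATE TABLE IF NOT EXISTS prices (sku TEXT, price REAL)")
    loaded = 0
    for row in csv.DictReader(io.StringIO(raw)):
        try:
            price = float(row["price"])  # transform + validate
        except ValueError:
            continue  # production code would route this row to quarantine
        conn.execute("INSERT INTO prices VALUES (?, ?)", (row["sku"], price))
        loaded += 1
    conn.commit()
    return loaded

conn = sqlite3.connect(":memory:")
print(run_etl(RAW_FEED, conn))  # 2 of 3 rows pass validation
```

An orchestrator such as Cloud Composer/Airflow would schedule this as a task, with the row counts and rejects feeding the monitoring called out below.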
Qualifications
- A bachelor's degree or higher in computer science, information systems, or a related field.
- Hands-on experience with cloud data platforms (e.g., GCP, Azure)
- Strong knowledge and skills in data automation technologies, such as Python, SQL, ETL/ELT tools, Kafka, APIs, cloud data pipelines, etc.
- Experience with GCP BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
- Experience with workflow orchestration tools such as Cloud Composer or Airflow
- Proficiency in iPaaS (Integration Platform as a Service) platforms, such as Boomi, SAP BTP, etc.
- Develop and manage data integrations for AI agents, connecting them to internal and external APIs, databases, and knowledge sources to expand their capabilities.
- Build and maintain scalable Retrieval-Augmented Generation (RAG) pipelines, including the curation and indexing of knowledge bases in vector databases (e.g., Pinecone, Vertex AI Vector Search).
- Leverage cloud-based AI/ML platforms (e.g., Vertex AI, Azure ML) to build, train, and deploy machine learning models at scale.
- Establish and enforce data quality and governance standards for AI/ML datasets, ensuring the accuracy, completeness, and integrity of data used for model training and validation.
- Collaborate closely with data scientists and machine learning engineers to understand data requirements and deliver optimized data solutions that support the entire machine learning lifecycle.
- Hands-on experience with IBM DataStage and Alteryx is a plus.
- Strong understanding of database design principles, including normalization, indexing, partitioning, and query optimization.
- Ability to design and maintain efficient, scalable, and well-structured database schemas to support both analytical and transactional workloads.
- Familiarity with BI visualization tools such as MicroStrategy, Power BI, Looker, or similar.
- Familiarity with data modeling tools.
- Familiarity with DevOps practices for data (CI/CD pipelines)
- Proficiency in project management software (e.g., JIRA, Clarizen, etc.)
- Strong knowledge and skills in data management, data quality, and data governance.
- Strong communication, collaboration, and problem-solving skills.
- Ability to work on multiple projects and prioritize tasks effectively.
- Ability to work independently and in a team environment.
- Ability to learn new technologies and tools quickly.
- The ability to handle stressful situations.
- Highly developed business acumen.
- Strong critical thinking and decision-making skills.
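The Retrieval-Augmented Generation bullet in the list above can be sketched end to end. A real pipeline would embed chunks with a model and index them in a vector store such as Pinecone or Vertex AI Vector Search; here a stdlib-only term-overlap cosine stands in for embeddings, and the knowledge-base chunks are invented:

```python
import math
from collections import Counter

# Invented knowledge-base chunks standing in for a curated corpus.
KNOWLEDGE_BASE = [
    "Wakefern supports the ShopRite and Price Rite retail banners.",
    "Cloud Composer orchestrates Airflow DAGs on GCP.",
    "BigQuery is a serverless data warehouse on Google Cloud.",
]

def _vector(text: str) -> Counter:
    """Bag-of-words vector; a real system would use dense embeddings."""
    return Counter(text.lower().split())

def _cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, k: int = 1) -> list[str]:
    """Retrieval step of RAG: return the k most similar chunks, which
    would then be passed to the LLM prompt as grounding context."""
    q = _vector(query)
    return sorted(KNOWLEDGE_BASE,
                  key=lambda c: _cosine(q, _vector(c)), reverse=True)[:k]

print(retrieve("what orchestrates airflow dags"))
```

Curation and indexing quality, as the bullet notes, dominates answer quality: the generator can only be as grounded as the chunks retrieval hands it.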
Working Conditions & Physical Demands
This position requires in-person office presence at least 4x a week.
Compensation and Benefits
The salary range for this position is $75,868 - $150,644. Placement in the range depends on several factors, including experience, skills, education, geography, and budget considerations.
Wakefern is proud to offer a comprehensive benefits package designed to support the health, well-being, and professional development of our Associates. Benefits include medical, dental, and vision coverage, life and disability insurance, a 401(k) retirement plan with company match & annual company contribution, paid time off, holidays, and parental leave.
Associates also enjoy access to wellness and family support programs, fitness reimbursement, educational and training opportunities through our corporate university, and a collaborative, team-oriented work environment. Many of these benefits are fully or partially funded by the company, with some subject to eligibility requirements.
Electrical Engineer - Data Centers - San Francisco
Metric DCX are partnered with a global engineering and consultancy firm to support the continued growth of their data center division.
This Electrical Engineer position will specialize in data center facility design to be embedded directly with a major end-user client.
Responsibilities:
- Assessing third-party and colocation facilities being considered for acquisition, evaluating their suitability against the client's portfolio requirements.
- Taking ownership of power systems across all project phases, identifying and resolving issues as they arise in collaboration with the relevant client stakeholders.
- Reviewing data center designs with a critical eye on redundancy architecture, availability targets, and potential single points of failure.
- Working closely with operations, planning, and energy strategy teams to push electrical solutions forward on third-party data center projects.
- Conducting technical due diligence and maintaining quality standards in line with client expectations.
- Keeping internal documentation, specs, and standards current based on live project feedback and lessons learned.
- Liaising with internal teams on power loading, rack deployment, and load balancing within shared facilities.
- Contributing to cross-discipline coordination with mechanical and controls engineers, and supporting consistency across regional teams.
Background Required
- Degree-qualified in Electrical Engineering; a postgraduate qualification or PE license would be a strong advantage.
- At least five years working within mission-critical environments, with solid hands-on exposure to colocation and multi-tenant data center projects specifically.
- Confident in power systems analysis and the software tools that come with it.
- Practical experience across the full electrical distribution stack — from high voltage transformers down to branch circuits — covering design, procurement, commissioning, and operations.
- Comfortable working across disciplines and engaging with structural, mechanical, civil, and IT/Telecom teams as needed.
- Grounded in US electrical codes and standards, with some awareness of IEC standards beneficial.
Staff Structural Engineer
Houston, TX | Full-Time | Engineering
We are seeking a Staff Structural Engineer to join a growing engineering team in Houston, TX. This role offers the opportunity to contribute to a wide range of structural design projects, including new construction and retrofit work, while collaborating with experienced engineers and multidisciplinary teams.
The ideal candidate will have a strong foundation in structural analysis, modeling, and design, along with experience using industry-standard engineering software and tools.
Key Responsibilities
- Assist with structural project design and analysis including PLS-Tower modeling, FEM structure modeling, and reinforcement design
- Support new construction and structural retrofit projects
- Contribute to the development of project budgets, schedules, and man-hour estimates
- Assist with project deliverables from conceptual design through detailed design
- Prepare and develop engineering drawings, layouts, and calculations
- Execute design modifications based on redlines, markups, and project changes
- Analyze reports, maps, drawings, and structural data to support project planning and design
- Apply engineering codes and specifications to ensure compliance with design requirements
- Review drawings and project documentation for quality assurance within scope, schedule, and budget
- Maintain organized documentation of 3D models, drawings, and project files
- Participate in structural design quality review processes, including back-checking drawings and reviewing shop drawings
- Collaborate with cross-functional teams to support project success
Required Qualifications
- Bachelor’s Degree in Structural Engineering or related field (ABET-accredited) with 3+ years of structural engineering experience, OR
- Bachelor’s Degree in Structural or Engineering Technology (ABET-accredited) with FE certification and 3+ years of experience, OR
- Master’s Degree in Structural Engineering with 2+ years of experience
Preferred Skills & Experience
- Strong knowledge of structural engineering principles, methods, and procedures
- Experience with structural analysis and modeling tools such as:
- RISA-3D
- ETABS
- SAFE
- Experience with design and modeling software including:
- AutoCAD
- Revit
- Tekla (BIM tools)
- Proficiency in Microsoft Office Suite
- Strong analytical, problem-solving, and critical thinking skills
- Excellent written and verbal communication abilities
- Strong attention to detail and ability to work in collaborative engineering teams
- Engineer in Training (EIT) certification preferred
Why Join?
- Opportunity to work on complex and impactful structural projects
- Collaborative and technical engineering-focused environment
- Exposure to advanced structural modeling and analysis tools
- Long-term career growth within a dynamic engineering team
Interested candidates are encouraged to apply or connect to learn more about this opportunity.
Job Summary:
Our client is seeking a Data Steward to join their team! This position is hybrid, located in Creve Coeur, Missouri.
Duties:
- Understand business capability needs and processes as they relate to IT solutions through partnering with Product Managers and business and functional IT stakeholders
- Participate in data scraping, data curation and data compilation efforts
- Ensure high quality of the data to end users
- Ensure high quality of the in-house data via data stewardship
- Implement and utilize data solutions for data analysis and profiling using a variety of tools such as SQL, Postman, R, or Python and following the team’s established processes and methodologies
- Collaborate with other data stewards and engineers within the team and across teams on aligning delivery dates and integration efforts
- Define data quality rules and implement automated monitoring, reporting, and remediation solutions
- Coordinate intake and resolution of data support tickets
- Support data migration from legacy systems, data inserts and updates not supported by applications
- Partner with the Data Governance organization to ensure data is secured and access is being managed appropriately
- Identify gaps within existing processes and create new documentation templates to improve existing processes and procedures
- Create mapping documents and templates to improve existing manual processes
- Perform data discoveries to understand data formats, source systems, etc. and engage with business partners in this discovery process
- Help answer questions from the end-users and coordinate with technical resources as needed
- Build prototype SQL queries and continuously engage with end consumers on enhancements
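The data-quality-rule duty above can be sketched as a small, runnable check. The table, column, and sample data are invented; in practice rules like this run on a schedule against governed sources and feed automated monitoring and remediation:

```python
import sqlite3

# Invented sample table with two missing trait codes.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE material (id INTEGER, trait_code TEXT)")
conn.executemany(
    "INSERT INTO material VALUES (?, ?)",
    [(1, "RR2"), (2, None), (3, "LL55"), (4, None), (5, "DT1")],
)

def null_rate(conn: sqlite3.Connection, table: str, column: str) -> float:
    """Data-quality rule: share of rows where `column` is NULL."""
    total, nulls = conn.execute(
        f"SELECT COUNT(*), SUM(CASE WHEN {column} IS NULL THEN 1 ELSE 0 END) "
        f"FROM {table}"
    ).fetchone()
    return nulls / total

print(f"trait_code null rate: {null_rate(conn, 'material', 'trait_code'):.0%}")
```

A threshold on the rate (say, flag anything above 5%) turns the metric into an alert suitable for the ticket-intake process described above.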
Desired Skills/Experience:
- Bachelor's Degree in Computer Science, Engineering, Science, or other related field
- Applied experience with modern engineering technologies and data principles (e.g., big data, cloud compute, NoSQL)
- Applied experience with querying SQL and/or NoSQL databases
- Experience in designing data catalogs, including data design, metadata structures, object relations, catalog population, etc.
- Data Warehousing experience
- Strong written and verbal communication skills
- Comfortable balancing demands across multiple projects / initiatives
- Ability to identify gaps in requirements based on business subject matter domain expertise
- Ability to deliver detailed technical documentation
- Expert level experience in relevant business domain
- Experience managing data within SAP
- Experience managing data using APIs
- BigQuery experience
Benefits:
- Medical, Dental, & Vision Insurance Plans
- Employee-Owned Profit Sharing (ESOP)
- 401(k) offered
The approximate pay range for this position is $104,000 - $115,000+. Please note that the pay range provided is a good faith estimate. Final compensation may vary based on factors including but not limited to background, knowledge, skills, and location. We comply with local wage minimums.
At KellyMitchell, our culture is world class. We’re movers and shakers! We don’t mind a bit of friendly competition, and we reward hard work with unlimited potential for growth. This is an exciting opportunity to join a company known for innovative solutions and unsurpassed customer service. We're passionate about helping companies solve their biggest IT staffing & project solutions challenges. As an employee-owned, women-led organization serving Fortune 500 companies nationwide, we deliver expert service at a moment's notice.
By applying for this job, you agree to receive calls, AI-generated calls, text messages, or emails from KellyMitchell and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy at
Duration: 6+ months
Location: 100% Remote
Job Overview
The Marketplace Data Product Engineer serves as the primary technical facilitator and adoption champion for the Marketplace platform. This role bridges engineering, product, and business domains, leading workshops, demos, onboarding sessions, and cross-domain engagements to accelerate Marketplace adoption. You will configure demo environments, support development, translate complex technical concepts for business audiences, gather product feedback, and partner closely with product and engineering teams to shape the Marketplace roadmap. You will also guide domains through understanding, showcasing, and maturing their data products within the ecosystem.
Key Responsibilities
- Facilitate workshops, demos, onboarding sessions, and cross-domain engagements to drive Marketplace adoption.
- Serve as the primary technical presenter of the Marketplace for domain teams and stakeholders.
- Engage with domain owners to understand their data products, help refine their articulation, and showcase how they integrate into the Marketplace ecosystem.
- Configure and maintain demo environments for Marketplace capabilities, data products, and new features.
- Support light development, proof-of-concept configurations, and sample integrations to demonstrate platform capabilities.
- Translate technical Marketplace concepts into clear, business-friendly language for non-technical audiences.
- Collect structured feedback from domain teams, synthesize insights, and partner with product and engineering to influence the roadmap.
- Develop and refine training materials, demos, playbooks, and onboarding assets to support continuous adoption.
- Act as an advocate for domains, ensuring their data product needs and challenges are well represented in Marketplace planning.
- Support ongoing adoption initiatives, including community sessions, office hours, and cross-domain knowledge sharing.
Required Skills & Qualifications
- 4-7+ years of experience in data engineering, platform engineering, solution engineering, technical consulting, or similar roles.
- Strong understanding of data products, data modeling concepts, data APIs, enterprise integrations, and metadata-driven architectures.
- Ability to configure and demonstrate platform features, build light proofs-of-concept, and support technical onboarding.
- Excellent communication and presentation skills, with experience translating technical concepts for business partners.
- Experience facilitating workshops, leading demos, or driving customer/product adoption initiatives.
- Ability to engage domain teams, understand their data product needs, and help articulate value within a larger ecosystem.
- Strong collaboration and stakeholder management skills across engineering, product, and business teams.
- Comfortable working in fast-moving environments and driving clarity through ambiguity.
Preferred Qualifications
- Experience with data product and governance frameworks, data marketplaces, data mesh concepts, or platform adoption roles.
- Hands-on experience with cloud data platforms (Azure, AWS, or GCP), data pipelines, or integration tooling.
- Familiarity with REST/GraphQL APIs, event-driven patterns, and data ingestion workflows.
- Background in solution architecture, customer engineering, or sales engineering.
- Experience developing demo environments, sample apps, or repeatable platform enablement assets.
- Strong storytelling ability when explaining data product value, domain capabilities, and Marketplace patterns.
Title: Lead Software Engineer - AI Application Platform
Mode of interview: 1 round, in person
Location: Charlotte, NC (must be local to work a hybrid model)
Main Skill set: Python, AI and Angular
Description:
Lead Software Engineer - AI Application Platform
The Opportunity
We are seeking a Lead Software Engineer to guide the architectural development and execution of the client's platform, a sophisticated AI-powered application generation system. This role suits a proven technical leader with deep, hands-on expertise across the full software stack who finds enabling a team to build better software deeply satisfying.
You will shape critical systems, mentor senior and junior developers through complex technical decisions, conduct rigorous code reviews across multiple technology domains, and directly influence the platform's trajectory through strategic engineering leadership.
This is for someone who:
- Engages thoughtfully when a junior developer asks targeted architectural questions—because you see an opportunity to shape how someone thinks about systems
- Takes time to explain subtle type-safety issues in code review, understanding that feedback is a teaching moment
- Can present architecture clearly to executives and confidently explain both what we're building and why it matters
- Finds more energy in the code your team ships than in the code you write individually
- Has proven depth across the full stack and a track record of developing engineers into stronger contributors
This is not a single-language codebase. The role requires the ability to make informed decisions on TypeScript design patterns, Python FastAPI architecture, AWS security posture, and Terraform state management in context with one another.
The Platform Challenge
The client is fundamentally a Platform-as-a-Service (PaaS) for dynamic application generation. This differs from building a traditional SaaS product. Rather than building one application, you're building infrastructure that enables users to build their own applications.
What this means architecturally:
- Dynamic Content Generation at Scale: Unlike traditional development where code is fixed, AppGen generates JSON form schemas, validation rules, and UI layouts on demand. The FormBuilder component doesn't know what fields will exist until runtime. The layout engine renders user-designed screens from configuration, not hardcoded templates.
- Multi-Tenant Isolation & Data Segregation: Each user gets their own generated app, potentially deployed to their own AWS environment. The architecture must account for data isolation, namespace management, and cross-tenant security considerations.
- User-Defined Data Structures: Traditional applications are built with predetermined database schemas. AppGen works differently—form structures, field types, and validation rules emerge from user conversations with Claude. This brings engineering challenges: How do you safely execute validation logic that users define? When users modify existing forms that have thousands of submissions, how do you maintain backward compatibility? How do you version schemas?
- Content Rendering, Not Code Generation: Unlike traditional no-code platforms where users drag-and-drop to build, AppGen uses AI instead. Users chat with Claude, Claude generates a form schema, and your platform renders that schema reliably across diverse field types, validation patterns, and workflows. The system renders configurations for immediate use, rather than generating code for later deployment.
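For illustration only, the runtime-validation challenge described above can be sketched in a few lines of Python. The schema shape, field names, and rule keys (`form_schema`-style dicts, `min_length`) are assumptions for this sketch, not the platform's actual API; the point is that validation rules live in data, because the renderer does not know the fields until runtime.

```python
# Minimal sketch: validating a submission against a form schema that is
# only known at runtime (schema shape and rule names are hypothetical).

def validate_submission(schema: dict, submission: dict) -> list[str]:
    """Return a list of validation errors for a user-defined form schema."""
    errors = []
    for field in schema["fields"]:
        name, ftype = field["name"], field["type"]
        value = submission.get(name)
        if value is None:
            if field.get("required", False):
                errors.append(f"{name}: required field is missing")
            continue
        # Type checks are data-driven: rules come from configuration,
        # not from hardcoded application code.
        if ftype == "string" and not isinstance(value, str):
            errors.append(f"{name}: expected string")
        elif ftype == "number" and not isinstance(value, (int, float)):
            errors.append(f"{name}: expected number")
        if ftype == "string" and isinstance(value, str) and "min_length" in field:
            if len(value) < field["min_length"]:
                errors.append(f"{name}: shorter than min_length={field['min_length']}")
    return errors


schema = {
    "fields": [
        {"name": "email", "type": "string", "required": True, "min_length": 5},
        {"name": "age", "type": "number", "required": False},
    ]
}
print(validate_submission(schema, {"age": "not-a-number"}))
# → ['email: required field is missing', 'age: expected number']
```

Schema versioning and backward compatibility (the "thousands of existing submissions" problem) would layer on top of a structure like this, for example by storing the schema version alongside each submission.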
Experience that directly transfers:
- You've contributed to or led development of low-code/no-code platforms (visual builders, workflow engines, configuration-driven systems)
- You've worked on SaaS platforms with multi-tenant architecture and understand isolation strategies, rate limiting, and per-customer customization
- You've built dynamic rendering systems that handle unknown/arbitrary schemas at runtime
- You've addressed the unique challenges of treating data configurations as user-created content (form builders, report designers, automation workflows)
- You understand the difference between platform infrastructure and applications built on that infrastructure—and the architectural implications of each
Core Responsibilities
1. Technical Architecture & Systems Thinking (40%)
- Shape architectural decisions across the full stack: How should the component layer handle dynamically generated forms? What's the right approach to validate complex cross-field dependencies in the FormBuilder? What separation of concerns makes sense between the Generator Lambda and the Parent Backend?
- Guide architecture discussions: Help senior developers think through design trade-offs. Should we use NgRx or Angular signals for this feature? When does a new Lambda function become worthwhile given cold-start costs?
- Identify and address system-wide bottlenecks: Work across layers to improve performance. Explore Lambda cold-start optimization, RDS query efficiency, and DynamoDB access patterns.
- Establish patterns and guide consistency: Define coding conventions that work across Python, TypeScript, and Terraform. Help new team members understand the reasoning behind architectural choices.
- What this looks like in practice: You're able to justify architectural decisions with technical reasoning. When someone questions an approach, you can explain the trade-offs you considered. You can write code in multiple languages to validate an approach if needed.
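The cross-field dependency question raised above can be made concrete with a small, hedged sketch: one plausible approach is to express inter-field rules as data so the FormBuilder can evaluate dependencies between fields it only learns about at runtime. The rule tuple format and function name here are illustrative assumptions, not the platform's design.

```python
from datetime import date

# Hypothetical sketch: cross-field rules as data, evaluated at runtime.
RULES = [
    # (field_a, operator, field_b, error message) - illustrative rule format
    ("start_date", "before", "end_date", "start_date must be before end_date"),
]

def check_cross_field(rules, values: dict) -> list[str]:
    """Evaluate declarative cross-field rules against submitted values."""
    errors = []
    for a, op, b, message in rules:
        # Only evaluate a rule when both referenced fields are present.
        if a in values and b in values:
            if op == "before" and not values[a] < values[b]:
                errors.append(message)
    return errors

print(check_cross_field(RULES, {"start_date": date(2025, 5, 1),
                                "end_date": date(2025, 4, 1)}))
# → ['start_date must be before end_date']
```

Keeping rules declarative means they can be generated, stored, and versioned alongside the form schema rather than baked into component code.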
2. Code Review & Technical Guidance (30%)
- Full-stack PR reviews: Review Python FastAPI endpoints and Angular components with equal depth, understanding how they interact.
- Deep technical review: Catch the issues that thoughtful code review can surface:
- RxJS Observable lifecycle and potential memory leaks in Angular
- Query efficiency and data loading patterns in SQLAlchemy
- Terraform module organization and state management implications
- Type safety and TypeScript coverage gaps
- AWS security and IAM configurations
- Educational feedback: Your code reviews help the team learn. When you identify an issue, reviewees understand not just what changed, but how to think about similar problems in the future.
- Define quality expectations: Work with the team to establish what "production-ready" means for this platform and support consistent application of those standards.
- What this requires: Experience reviewing code across teams and multiple languages. You know how to write feedback that resonates—clear, constructive, and focused on helping people improve.
3. Mentorship & Team Development (20%)
- Expand specialist capabilities: Help backend specialists learn to contribute to the forms-engine. Support frontend experts in understanding FastAPI patterns.
- Accelerate junior developers: Pair on complex problems. Explain the reasoning behind patterns like DataState. Connect architectural choices to implementation details and performance implications.
- Identify and address gaps: Recognize when someone is struggling with a technology and provide targeted support—training, pair programming, or guidance through architectural decisions.
- Create growth opportunities: Stretch the team into new areas. A backend engineer working on their first Terraform contribution. A frontend specialist implementing an AWS Lambda authorizer.
- What this requires: Genuine investment in people's growth. You've walked developers through major transitions (generalist to specialist, specialist to full-stack, or into new technology areas). You understand that team strength grows when individuals expand their capabilities.
4. Stakeholder Communication & Technical Leadership (10%)
- Explain to diverse audiences: Translate architectural choices and trade-offs for product managers, executives, and business stakeholders. Connect "optimizing DynamoDB queries" to "improving form submission latency by 30%."
- Shape technical direction: Contribute the engineering perspective on feasibility, risk, and what unlocks future capabilities.
- Support release confidence: You understand the code changes, comprehend the risks, and know what to monitor. You can stand behind releases.
Required Qualifications
Technical Skills
Frontend (Production Experience)
- 5+ years of Angular (including handling version migrations, optimizing change detection, and guiding teams through reactive patterns)
- Strong TypeScript skills with generics, discriminated unions, and strict mode
- RxJS depth: You understand hot vs. cold observables, unsubscription patterns, and can identify potential memory issues in reviews
- NgRx state management: You've designed stores at scale, optimized selectors, and evaluated architectural implications
- CSS Grid & Responsive Design: You can assess component hierarchy and layout decisions
- Material Design: You've worked within it and know when and how to extend it
Backend (Production Experience)
- 5+ years of Python (async/await, type hints, data modeling)
- FastAPI production experience: session management, dependency injection, middleware
- SQL and ORMs (SQLAlchemy): You write efficient queries and review them critically
- AWS services: Understanding of Lambda behavior, IAM least-privilege patterns, VPC networking
- REST API design: Versioning, error handling, idempotency
- Testing frameworks: pytest, testing st
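As a hedged illustration of the idempotency requirement above: the usual pattern is that a retried request carrying the same idempotency key returns the stored result instead of re-executing the side effect. The in-memory store and function name below are assumptions for the sketch; a real service would persist keys with a TTL.

```python
# Sketch of idempotency-key handling for a POST-style operation: repeated
# calls with the same key replay the stored result instead of re-executing.
# The in-memory dict and create_order name are illustrative assumptions.

_results: dict[str, dict] = {}

def create_order(idempotency_key: str, payload: dict) -> dict:
    if idempotency_key in _results:
        return _results[idempotency_key]  # replay: no new side effects
    order = {"id": len(_results) + 1, "payload": payload, "status": "created"}
    _results[idempotency_key] = order
    return order

first = create_order("key-123", {"item": "widget"})
second = create_order("key-123", {"item": "widget"})
print(first["id"] == second["id"])  # → True: the retry created no new order
```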
Remote working/work at home options are available for this role.