Loloi Rugs is a leading textile brand that designs and crafts rugs, pillows, and throws for the thoughtfully layered home. Family-owned and led since 2004, Loloi is growing more quickly than ever. To date, we’ve expanded our diverse team to hundreds of employees, invested in multiple distribution facilities, introduced thousands of products, and earned the respect and business of retailers and designers worldwide. A testament to our products and our team, Loloi has earned the ARTS Award for “Best Rug Manufacturer” in 2010, 2011, 2015, 2016, 2018, 2023, and 2025.
Security Advisory: Beware of Frauds
Protect yourself from potential fraud and verify the authenticity of any job offer you receive from Loloi. Rest assured that we never request payment or demand any sensitive personal information, such as bank details or social security numbers, at any stage of the recruiting process. To ensure genuine communication, our recruiters will solely reach out to applicants using an @ email address. Your security is of paramount importance to us at Loloi, and we are committed to maintaining a safe and trustworthy hiring experience for all candidates.
We are building a Business Operations Center of Excellence, and we need a Product Data Analyst to serve as the "Guardian of the Golden Record." In this role, you are the absolute owner of product data integrity as it relates to the digital customer experience. You ensure that every item we sell is accurately represented across every touchpoint—from our ERP and PIM to our website storefront and marketing feeds. This is not a data entry role; it is a high-impact technical logic and investigation role. You will work directly with our Data Platform and Software Engineering teams to define business rules, audit data health via complex SQL, and troubleshoot data transmission errors before they impact the customer.
Responsibilities
- Storefront Governance: Serve as the absolute owner of product data integrity within the PIM. Ensure that all storefront-critical attributes (pricing, dimensions, weights, image links) are accurate and standardized for a seamless customer experience.
- Technical Data Auditing: Write and run complex SQL queries against our centralized database to identify anomalies, "orphan" records, and data hygiene issues that need resolution. You will be expected to query across multiple schemas to validate data consistency between systems.
- Feed Logic & Mapping: You will manage the logic of how data translates from our PIM to external endpoints. You will ensure that our products appear correctly on Google Shopping, Meta, Amazon, and other marketplaces by managing feed rules and mapping definitions.
- API Payload Analysis: You will act as the first line of defense for data transmission errors. If a product isn't showing up on the site, you will review the JSON/XML response bodies to determine if it is a data payload error or a software code bug.
- Cross-Functional Impact Analysis: You will act as the gatekeeper for data changes, predicting downstream impacts (e.g., "If Merchandising changes this Category Name, it will break the Finance reporting filter").
- Hygiene Logic Definition: You will partner with our IT/Database team to define automated health checks. You identify the "rot" (bad data patterns), and they implement the database constraints to stop it.
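To make the auditing work above concrete, here is a minimal, hypothetical sketch of an "orphan record" check using Python's built-in sqlite3. The table and column names are invented for illustration; the real role would run comparable SQL against Loloi's centralized database.

```python
# Minimal sketch of an "orphan record" audit. Tables and columns are
# hypothetical: pim_items is the golden record, storefront_items the site.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE pim_items (sku TEXT PRIMARY KEY, price REAL, weight REAL);
CREATE TABLE storefront_items (sku TEXT PRIMARY KEY, price REAL);
INSERT INTO pim_items VALUES ('RUG-001', 199.0, 12.5), ('RUG-002', 349.0, 20.0);
INSERT INTO storefront_items VALUES ('RUG-001', 199.0), ('RUG-999', 89.0);
""")

# Storefront rows with no matching golden record are "orphans" to investigate.
orphans = conn.execute("""
    SELECT s.sku
    FROM storefront_items AS s
    LEFT JOIN pim_items AS p ON p.sku = s.sku
    WHERE p.sku IS NULL
""").fetchall()
print(orphans)  # [('RUG-999',)]
```

In practice the same LEFT JOIN / IS NULL pattern generalizes to any cross-system consistency check between schemas.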
What You Will NOT Do (The Boundaries)
- No Web Development: You are not a Front-End Developer. You do not write HTML, CSS, or React code. You ensure the data powering those components is 100% accurate.
- No Manual Data Entry: Your job is not to copy-paste descriptions. You build the systems, bulk processes, and logic that ensure data quality at scale.
- No Database Administration: You do not manage server uptime or schema changes (IT owns this). You own the quality of the records inside the database.
Intersection with Technical Teams
- With IT (Database Mgmt): IT owns the infrastructure and schema; you own the quality of the data within it. When you identify a systemic issue (e.g., "5,000 orphan records"), you partner with IT to implement the technical fix (scripts/constraints).
- With Software Engineering (Commerce): If a product is missing from the site, you check the data payload. If the data is correct, you hand off to Engineering, confirming it is a code/caching bug rather than a data error.
Experience, Skills, & Ability Requirements
- 5-8 years of experience in Data Management, PIM Administration, or technical eCommerce Operations.
- SQL Proficiency: You are comfortable writing queries beyond simple SELECT *. You should be proficient with CTEs (Common Table Expressions), Window Functions (e.g., Rank, Lead/Lag), Subqueries, and complex Joins to act as a forensic data investigator.
- API Fluency: You can read and understand JSON and XML. You know what a valid payload looks like and can spot formatting errors or missing keys.
- Data Manipulation: You are an expert at handling large datasets (CSVs, Excel) and understand data types, formatting standards, and normalization concepts.
- You love hunting down the root cause of an error. You don't just fix the wrong price; you find out why the price was wrong and build a rule to stop it from happening again.
- You have high standards for accuracy. You understand that a wrong weight in the system means a financial loss on shipping for the business.
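As a rough illustration of the payload triage described above, the sketch below checks a JSON response body for the keys a storefront might require. The field names are assumptions, not Loloi's actual schema.

```python
# Hedged sketch of JSON payload triage: is the product missing from the
# site because of a data payload error? Required keys are hypothetical.
import json

REQUIRED_KEYS = {"sku", "price", "weight", "image_url"}

def missing_keys(payload_text: str) -> set:
    """Return required keys absent from a JSON payload."""
    payload = json.loads(payload_text)
    return REQUIRED_KEYS - payload.keys()

good = '{"sku": "RUG-001", "price": 199.0, "weight": 12.5, "image_url": "x"}'
bad = '{"sku": "RUG-002", "price": 349.0}'

print(missing_keys(good))  # set() -> likely a code/caching bug, hand off
print(missing_keys(bad))   # {'weight', 'image_url'} -> a data payload error
```

An empty result points toward an engineering hand-off; missing keys point back to the data side.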
Bonus Points (Nice-to-Haves)
- Familiarity with Visio/Lucidchart to visualize data flows.
- Ability to build simple dashboards in Tableau to track data health scores.
- Basic familiarity with Python or R for data manipulation.
What We Offer
- Health, dental, and vision benefits
- Paid parental leave
- 401(k) with employer match
- A culture of meritocracy that fosters ongoing growth opportunities
- A stable, growing family-owned company that looks after its employees
Loloi Rugs does not discriminate on the basis of race, sex, color, religion, age, national origin, marital status, disability, veteran status, genetic information, sexual orientation, gender identity or any other reason prohibited by law in provision of employment opportunities and benefits. We seek a diverse pool of applicants and consider all qualified candidates regardless of race, ancestry, color, gender identity or expression, sexual orientation, religion, national origin, citizenship, disability, Veteran status, marital status, or any other protected status. If you have a special need or disability that requires accommodation, please let us know.
Job Title – Lead Data Engineer
Please note: this role is not able to offer visa transfer or sponsorship, now or in the future.
About the role
As a Lead Data Engineer, you will make an impact by designing, building, and operating scalable, cloud‑native data platforms supporting batch and streaming use cases, with strong focus on governance, performance, and reliability. You will be a valued member of the Data Engineering team and work collaboratively with cross‑functional engineering, cloud, and architecture stakeholders.
In this role, you will:
- Design, build, and operate scalable cloud‑native data platforms supporting batch and streaming workloads with strong governance, performance, and reliability.
- Develop and operate data systems on AWS, Azure, and GCP, designing cloud‑native, scalable, and cost‑efficient data solutions.
- Build modern data architectures including data lakes, data lakehouses, and data hubs, with strong understanding of ingestion patterns, data governance, data modeling, observability, and platform best practices.
- Develop data ingestion and collection pipelines using Kafka and AWS Glue; work with modern storage formats such as Apache Iceberg and Parquet.
- Design and develop real‑time streaming pipelines using Kafka, Flink, or similar streaming frameworks, with understanding of event‑driven architectures and low‑latency data processing.
- Perform data transformation and modeling using SQL‑based frameworks and orchestration tools such as dbt, AWS Glue, and Airflow, including Slowly Changing Dimensions (SCD) and schema evolution.
- Use Apache Spark extensively for large‑scale data transformations across batch and streaming workloads.
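The Slowly Changing Dimension (SCD) handling mentioned above can be sketched in plain Python for illustration; a production pipeline would typically express the same Type 2 logic as a MERGE in dbt or Spark SQL. Field names here are assumptions.

```python
# Illustrative SCD Type 2 logic: expire the current row when an attribute
# changes, then append a new current row. Row fields are hypothetical.
from datetime import date

def scd2_apply(dim_rows, update, today):
    """Apply one update to a Type 2 dimension (list of row dicts)."""
    out, changed = [], False
    for row in dim_rows:
        if row["key"] == update["key"] and row["is_current"] and row["attr"] != update["attr"]:
            # Close out the old version of the record.
            row = {**row, "valid_to": today, "is_current": False}
            changed = True
        out.append(row)
    if changed or not any(r["key"] == update["key"] for r in dim_rows):
        out.append({"key": update["key"], "attr": update["attr"],
                    "valid_from": today, "valid_to": None, "is_current": True})
    return out

dim = [{"key": "C1", "attr": "Bronze", "valid_from": date(2024, 1, 1),
        "valid_to": None, "is_current": True}]
dim = scd2_apply(dim, {"key": "C1", "attr": "Gold"}, date(2024, 6, 1))
print([(r["attr"], r["is_current"]) for r in dim])
# [('Bronze', False), ('Gold', True)]
```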
Work model
We believe hybrid work is the way forward as we strive to provide flexibility wherever possible. Based on this role’s business requirements, this is a hybrid position requiring 4 days a week in a client or Cognizant office in Atlanta, GA. Regardless of your working arrangement, we are here to support a healthy work-life balance through our various wellbeing programs.
The working arrangements for this role are accurate as of the date of posting. This may change based on the project you’re engaged in, as well as business and client requirements. Rest assured, we will always be clear about role expectations.
What you need to have to be considered
- Hands‑on experience developing and operating data systems on AWS, Azure, and GCP.
- Proven ability to design cloud‑native, scalable, and cost‑efficient data solutions.
- Experience building data lakes, data lakehouses, and data hubs with strong understanding of ingestion patterns, governance, modeling, observability, and platform best practices.
- Expertise in data ingestion and collection using Kafka and AWS Glue, with experience in Apache Iceberg and Parquet.
- Strong experience designing and developing real‑time streaming pipelines using Kafka, Flink, or similar streaming frameworks.
- Deep expertise in data transformation and modeling using SQL‑based frameworks and orchestration tools including dbt, AWS Glue, and Airflow, with knowledge of SCD and schema evolution.
- Extensive experience using Apache Spark for large‑scale batch and streaming data transformations.
These will help you stand out
- Experience with event‑driven architectures and low‑latency data processing.
- Strong understanding of schema evolution, SCD modeling, and modern data modeling concepts.
- Experience with Apache Iceberg, Parquet, and modern ingestion/storage patterns.
- Strong knowledge of observability, governance, and platform best practices.
- Ability to partner effectively with cloud, architecture, and engineering teams.
Salary and Other Compensation:
Applications will be accepted until March 17, 2025.
The annual salary for this position is between $81,000 and $135,000, depending on experience and other qualifications of the successful candidate.
This position is also eligible for Cognizant’s discretionary annual incentive program, based on performance and subject to the terms of Cognizant’s applicable plans.
Benefits: Cognizant offers the following benefits for this position, subject to applicable eligibility requirements:
- Medical/Dental/Vision/Life Insurance
- Paid holidays plus Paid Time Off
- 401(k) plan and contributions
- Long‑term/Short‑term Disability
- Paid Parental Leave
- Employee Stock Purchase Plan
Disclaimer: The salary, other compensation, and benefits information is accurate as of the date of this posting. Cognizant reserves the right to modify this information at any time, subject to applicable law.
OZ – Databricks Architect/ Senior Data Engineer
Note: Only applications from U.S. citizens or lawful permanent residents (Green Card Holders) will be considered.
We believe work should be innately rewarding and a team-building venture. Working with our teammates and clients should be an enjoyable journey where we can learn, grow as professionals, and achieve amazing results. Our core values revolve around this philosophy. We are relentlessly committed to helping our clients achieve their business goals, leapfrog the competition, and become leaders in their industry. What drives us forward is the culture of creativity combined with a disciplined approach, passion for learning & innovation, and a ‘can-do’ attitude!
What We're Looking For:
We are seeking a highly experienced Databricks professional with deep expertise in data engineering, distributed computing, and cloud-based data platforms. The ideal candidate is both an architect and a hands-on engineer who can design scalable data solutions while actively contributing to development, optimization, and deployment.
This role requires strong technical leadership, a deep understanding of modern data architectures, and the ability to implement best practices in DataOps, performance optimization, and data governance.
Experience with modern AI/GenAI-enabled data platforms and real-time data processing environments is highly desirable.
Position Overview:
The Databricks Senior Data Engineer will play a critical role in designing, implementing, and optimizing enterprise-scale data platforms using the Databricks Lakehouse architecture. This role combines architecture leadership with hands-on engineering, focusing on building scalable, secure, and high-performance data pipelines and platforms. The ideal candidate will establish coding standards, define data architecture frameworks such as the Medallion Architecture, and guide the end-to-end development lifecycle of modern data solutions.
This individual will collaborate with cross-functional stakeholders, including data engineers, BI developers, analysts, and business leaders, to deliver robust data platforms that enable advanced analytics, reporting, and AI-driven decision-making.
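The Medallion Architecture named above can be sketched, very loosely, as staged refinement: raw data lands in "bronze," is validated and typed into "silver," and is aggregated into business-ready "gold." In Databricks these layers would be Delta tables; the example below uses plain Python dicts purely for illustration.

```python
# Hedged sketch of medallion layering with invented records and fields.
# Bronze: raw, untyped events exactly as ingested.
bronze = [
    {"sku": "RUG-001", "qty": "2", "price": "199.0"},
    {"sku": None,      "qty": "1", "price": "89.0"},   # bad record
]

# Silver: validate and type-cast, dropping records that fail checks.
silver = [
    {"sku": r["sku"], "qty": int(r["qty"]), "revenue": int(r["qty"]) * float(r["price"])}
    for r in bronze if r["sku"] is not None
]

# Gold: business-level aggregate ready for BI consumption.
gold = {"total_revenue": sum(r["revenue"] for r in silver)}
print(gold)  # {'total_revenue': 398.0}
```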
Key Responsibilities:
- Architecture & Design: Architect and design scalable, reliable data platforms and complex ETL/ELT and streaming workflows for the Databricks Lakehouse Platform (Delta Lake, Spark).
- Hands-On Development: Write, test, and optimize code in Python, PySpark, and SQL for data ingestion, transformation, and processing.
- DataOps & Automation: Implement CI/CD, monitoring, and automation (e.g., with Azure DevOps, DBX) for data pipelines.
- Stakeholder Collaboration: Work with BI developers, analysts, and business users to define requirements and deliver data-driven solutions.
- Performance Optimization: Tune delta tables, Spark jobs, and SQL queries for maximum efficiency and scalability.
- GenAI Applications Development: Experience developing GenAI applications is a strong plus.
Requirements:
- 8+ years of experience in data engineering, with strong hands-on expertise in Databricks and Apache Spark.
- Proven experience designing and implementing scalable ETL/ELT pipelines in cloud environments.
- Strong programming skills in Python and SQL; experience with PySpark required.
- Hands-on experience with Databricks Lakehouse, Delta Lake, and distributed data processing.
- Experience working with cloud platforms such as Microsoft Azure, AWS, or GCP (Azure preferred).
- Experience with CI/CD pipelines, Git, and DevOps practices for data engineering.
- Strong understanding of data architecture, data modeling, and performance optimization.
- Experience working with cross-functional teams to deliver enterprise data solutions.
- Proven ability to tackle complex data challenges while ensuring data quality and reliable delivery.
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
- Experience designing enterprise-scale data platforms and modern data architectures.
- Experience with data integration tools such as Azure Data Factory or similar platforms.
- Familiarity with cloud data warehouses such as Databricks, Snowflake, or Azure Fabric.
- Experience supporting analytics, reporting, or AI/ML workloads is highly desirable.
- Databricks, Azure, or cloud certifications are preferred.
- Strong problem-solving, communication, and technical leadership skills.
Technical Proficiency in:
- Databricks, Apache Spark, PySpark, Delta Lake
- Python, SQL, Scala (preferred)
- Cloud platforms: Azure (preferred), AWS, or GCP
- Azure Data Factory, Kafka, and modern data integration tools
- Data warehousing: Databricks, Snowflake, or Azure Fabric
- DevOps tools: Git, Azure DevOps, CI/CD pipelines
- Data architecture, ETL/ELT design, and performance optimization
What You’re Looking For:
Join a fast-growing organization that thrives on innovation and collaboration. You’ll work alongside talented, motivated colleagues in a global environment, helping clients solve their most critical business challenges. At OZ, your contributions matter – you’ll have the chance to be a key player in our growth and success. If you’re driven, bold, and eager to push boundaries, we invite you to join a company where you can truly make a difference.
About Us:
OZ is a 28-year-old global technology consulting, services, and solutions leader specializing in creating business-focused solutions for our clients by leveraging disruptive digital technologies and innovation.
OZ is committed to creating a continuum between work and life by allowing people to work remotely. We offer competitive compensation and a comprehensive benefits package. You’ll enjoy our work style within an incredible culture. We’ll give you the tools you need to succeed so you can grow and develop with us and become part of a team that lives by its core values.
Company Description
Liberty Bankers Insurance Group, headquartered in Dallas, Texas, includes Liberty Bankers Life Insurance Company, Capitol Life Insurance Company, and American Benefit Life Insurance Company, with 115 years of experience in serving insurance needs. The organization values integrity, dignity, and respect in interactions with customers and team members, fostering a culture that prioritizes trust and excellent service. Known for its customer-centric approach, the company is committed to building long-lasting relationships based on transparency. Liberty Bankers Insurance Group provides diverse insurance and financial products, supported by a dedicated team driven to deliver prompt and accurate service.
Role Description
This is a full-time, on-site Structured Credit Analyst role based in Dallas, TX. The Structured Products Analyst will analyze, recommend, and trade ABS, including some non-agency RMBS and CMBS, for purchase or sale to further diversify our portfolio, manage risk, increase yield, and increase our NAIC1 holdings.
Responsibilities
- Analyze and recommend new issue and secondary market ABS/RMBS/CMBS for purchase to maximize risk-adjusted returns.
- Continually review current portfolio holdings for increased risk and recommend sales.
- Recommend structured finance portfolio changes to reduce risk, increase yield, further diversify and increase NAIC1 holdings.
- Document research and present it at the monthly and quarterly meetings and as needed.
Qualifications and Skills
- College degree required, MBA and/or CFA a plus.
- At least 2 years’ experience in analyzing Structured Finance products.
- Proficient in Bloomberg, Intex, Microsoft Office, and preferably KCP and Costar.
- Proficient in Mathematics and Finance.
- Ability to make and support decisions in a prudent and timely manner.
Main Duties / Required:
- Knowledge and use of basic telecom hand tools.
- Must understand customer service.
- Clear understanding of job safety requirements.
- Be able to read and understand floor plans.
- Should be familiar with wiring schemes and wiring testing.
- Should be able to pull all types of low voltage cable.
- Should understand and be capable of performing field terminations and labeling.
- Reports to the Operations Manager and takes daily direction from Technicians, Technician IIs, Lead Technicians, Senior Technicians, and Advanced Senior Technicians.
- Capable of maintaining orderly paperwork and running service jobs.
- Possess the skill to lay out MDF and IDF closets, dress all types of cable, and perform all types of terminations.
- Capable of working in Data Centers
- Shall be able to install ladder racking and seismic bracing both above and under raised floor.
- Basic understanding of both copper and fiber cable testing and troubleshooting.
- Read and understand blueprints and design documents
- Dress and furcate fiber trunks for splicing
- Maintain orderly paperwork and run service jobs
- Fusion Splice including Ribbon/Single OSP/ISP
- Install, connect, and decom network equipment
- Operate DSX 5000 tester/OTDR Tester
- Program testers
- Download test results to Linkware/Linkware Live
- Save test results, verify, and submit to customer
- Create mass labels and apply per Portmap
- Differentiate live cables from decom cable
- Copper testing and troubleshooting
- Conduct Service Swaps of live networking devices
- Understand "FIM" database and operate scanners
PHYSICAL REQUIREMENTS
- Primarily walking, standing, and bending for extended periods with some sitting.
- Ability to communicate effectively with verbal, written, visual and listening skills.
- Dexterity of hands and fingers to operate any required equipment as well as to operate a computer keyboard, mouse, and other technical instruments.
- Able to lift and carry heavy equipment, up to 50 pounds.
- Ability to pull cables.
- Ability to climb ladders.
Nice-to-Have Skills / Key Words:
Data Center
Technician
Decommission
Splicing
Low Voltage
“Let goodness, fairness, and most importantly, love prevails in business; profits will inevitably follow.” – NK Chaudhary, founder
What we do for our team members:
- Comprehensive Benefits: Company Paid Holidays, PTO, Parental Involvement Leave, Maternity/Paternity Leave, EAP, No Cost Employee Medical Plan, Vision, Dental, and Company Paid Life Insurance. We also include a match on retirement (401K/Roth).
- Career Development: We're committed to providing growth for career development within the company, supporting our team members' aspirations with a well-defined succession plan that includes a variety of training and development opportunities.
- Pet-Friendly Workplace: We welcome your furry friends! Our 'Bring Your Dogs to Work' policy creates a pet-friendly atmosphere, allowing our team members to enjoy the companionship of their dogs during the workday.
- Wellness Support: Not only do we support an active lifestyle with our on-site basketball court and yoga studio, but we host quarterly mental health events to assist in creating a well-rounded work-life harmony for our team members.
- Sustainability Efforts: Reuse, Renew, and Refresh by joining our Green Team! Responsible for harvesting from the organic community garden, donating goods to local pet shelters and schools, creating educational workshops, leading nature walks, and much more, they promote well-being through sustainable practices.
Our Values
Empowerment • Inclusiveness • Responsibility • Progressive
Learn more about our company story here: Jaipur Rugs Foundation
Since 2004, the Jaipur Rugs Foundation has worked to improve the lives of rug-weaving artisans in India. This is done through training, skills development, and social interventions. By focusing on the ideas and solutions that create social value, the Foundation supports the dignity and heritage of these traditional artisans, believing that healthy and sustainable communities are key to the survival of traditional rug weaving. Jaipur Living has made ethical and socially conscious global citizenship the foundation of its business. Through social initiatives and the Jaipur Rugs Foundation, the company supports a supplier ecosystem without a middleman of more than 40,000 artisans in 700 villages across India by providing them with a livable wage, access to health care, leadership education, and opportunities for personal growth and development. Combining time-honored techniques and of-the-moment trends, every Jaipur Living product is as ethically and responsibly made as it is beautiful.
Learn more about the Jaipur Rugs Foundation here.
We are a fast-growing, design-led B2B home décor and textiles brand with big ambitions. Over the last 12 months, we have revolutionized our technical foundation, investing in Microsoft Dynamics 365 (F&O) and a Microsoft Fabric ecosystem. We are now looking for a seasoned leader to refine our existing infrastructure, optimize our end-to-end data workflows, and bridge the gap between "raw data" and "reliable business intelligence."
This role demands a strong balance of technical depth and operational management. While you must possess expert-level proficiency in data engineering, specifically within the Microsoft Fabric ecosystem and modern data platforms, we also need a leader who is experienced in analytics, data visualization, BI, and translating business needs into analytical solutions. You will be responsible for defining and executing an outcome-based Data & Analytics strategy, building and developing a global team of data engineers, BI developers, and data analysts, and ensuring the company has trusted, scalable, and decision-ready data at every level of the organization. The ideal candidate is a Fabric-certified or Fabric-trained leader, an exceptional communicator, and a proven people manager who can balance hands-on technical depth with strategic leadership.
Key Responsibilities:
Strategic Management & Outcome-Based Delivery
- Tactical Roadmap: Develop and execute a multi-year roadmap that aligns data engineering, BI, and advanced insights with business priorities (e.g., inventory efficiency, margin protection, and growth).
- Process Standardization: Define what “good” looks like for data reliability, documentation, insight quality, and business impact
- Baseline Maturity: Shift the organization from ad-hoc reporting to repeatable, trusted, decision-ready data products
- Advance Automation: Assess the current-state landscape and define a clear path from foundational reporting to automated, predictive analytics.
- Executive Communication: Serve as the single point of accountability for all data and analytics capabilities, translating technical progress into business-relevant implications across the organization
Infrastructure Optimization & Fabric Engineering
- Systemic Optimization: Lead the audit and refinement of the existing Fabric environment (Lakehouse, Pipelines, Notebooks) to improve overall performance, stability, and refresh reliability
- Engineering Standards: Set the "gold standard" for architecture, data modeling, testing, and deployment (CI/CD), ensuring the stack is hardened for enterprise-scale growth
- Reduce Manual Effort: Minimize operational risk by standardizing pipelines, refresh processes, and metric calculations
- Automation & Reliability: Systematically identify and eliminate manual reporting and spreadsheet-based workflows through robust automation in PySpark and Fabric
- Proactive Governance: Establish monitoring, alerting, and exception-handling processes to manage data quality and refresh failures before they impact the business
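The proactive governance bullet above can be sketched as simple rule functions that flag problems before a refresh publishes. The thresholds and field names below are illustrative assumptions; in Fabric this logic would typically live in a Notebook or pipeline activity feeding alerting.

```python
# Minimal sketch of pre-publish data-quality checks; rules are hypothetical.
def check_refresh(rows, max_null_rate=0.01, min_rows=1):
    """Return a list of human-readable failures for a refreshed table."""
    failures = []
    if len(rows) < min_rows:
        failures.append("row count below minimum")
        return failures
    null_skus = sum(1 for r in rows if r.get("sku") is None)
    if null_skus / len(rows) > max_null_rate:
        failures.append(f"null sku rate {null_skus / len(rows):.0%} exceeds threshold")
    return failures

rows = [{"sku": "RUG-001"}, {"sku": None}, {"sku": "RUG-002"}, {"sku": "RUG-003"}]
print(check_refresh(rows))  # ['null sku rate 25% exceeds threshold']
```

A non-empty failure list would block publication and raise an alert rather than letting bad data reach the business.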
Analytics & Decision Enablement
- High-Quality BI Delivery: Oversee the design and delivery of visually appealing Power BI dashboards that simplify complexity and adhere to our design-led brand standards
- Metric Governance: Ensure KPI definitions and reporting logic are consistent across the company, acting as the arbiter of "the truth" for business metrics
- Advanced Analytics: Identify and operationalize high-value use cases for predictive analytics (e.g., demand forecasting, product lifecycle analysis) as platform maturity increases
- Business Translation: Partner with business leaders to translate business requirements into scalable, intuitive, impactful analytics solutions
- Business Evolution: Lead the transition from descriptive and diagnostic reporting to forward-looking insights that support planning and decision-making
Global Team Leadership & Talent Development
- People Leadership: Directly lead and develop a 3–5 person global team (primarily based in India), establishing clear roles, accountability, and a high-performance culture
- Skill Development: Create career paths and skill-development plans for engineers and analysts to ensure consistent, high-quality delivery
- Operating Model: Build a scalable offshore capability that delivers at speed while maintaining rigorous standards for code quality and documentation
Skills & Minimum Qualifications:
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of knowledge, skill, and/or ability required. Reasonable accommodation may be made to enable individuals with disabilities to perform essential functions.
- 10+ years of experience in data engineering, analytics, or BI, with director-level scope or equivalent ownership
- Deep hands-on experience with Microsoft Fabric (Lakehouse, Pipelines, Notebooks, semantic models)
- Fabric certification or formal Fabric training strongly preferred
- Strong experience with PySpark and Spark-based transformations
- Strong understanding of Azure data services and modern data architectures
- Exceptional dashboard-development skills using Power BI; portfolio-quality experience preferred
- Strong understanding of data storytelling, executive-ready visualization, and intuitive UI/UX design
- Experience gathering business requirements and translating them into analytical products
- Proven experience leading and developing global / offshore teams
- Strong communicator with the ability to influence at senior levels
- Experience supporting ERP-driven environments; Dynamics 365 preferred
- Ability to juggle strategy, execution, and stakeholder communication simultaneously
Success Measures (First 12–18 Months)
- Strategy Execution: An outcome-based Data & Analytics strategy that is fully operational and tied to business outcomes
- Optimized Infrastructure: A trusted, scalable Fabric platform with significantly reduced manual reporting and 99%+ data availability
- Dashboard Adoption: A suite of high-quality dashboards used daily and weekly by business leaders to drive decision-making
- Team Growth: A high-performing global team with a track record of delivering complex analytics products with speed and precision
Physical Requirements:
- Remaining in a seated position for long periods of time
- Standing: remaining on one’s feet in an upright position without moving about
- The ability to alternate between sitting and standing as needed when this cannot be accommodated by scheduled breaks and/or a lunch period
- Lifting and transporting items that could weigh up to 25 pounds
- Entering text or data into a computer by means of a traditional keyboard
- Expressing or exchanging ideas by means of the spoken word to impart oral information to clients and talent and to convey detailed spoken instructions to other workers accurately and quickly
- The ability to hear, understand, and distinguish speech and/or other sounds such as in person and telephone
- Clarity of vision to see computer screens and workspace
The Data Engineering Manager is responsible for leading and developing a team of Data Architects and Data Solutions Engineers while actively contributing to hands-on technical projects. This role will manage the data warehouse in Snowflake, engineering automations in Alteryx and/or other solutions, while ensuring efficient project intake and prioritization. The ideal candidate combines strong technical expertise with proven technical leadership skills to drive innovation and operational excellence across the data engineering function.
As a Data Engineering Manager, you will:
- Set the technical strategy for data engineering solutions and data architecture which includes end to end data pipeline strategy, consumption management, project scoping, and data automation.
- Design, develop, and optimize data engineering solutions using Snowflake, DBT, Azure Data Factory, and Alteryx.
- Continuously assess and optimize the data engineering technology stack to ensure scalability, performance, and alignment with industry best practices.
- Implement best practices for data modeling, ETL/ELT processes, and automation.
- Own and maintain the Snowflake data warehouse roadmap and engineering standards.
- Lead data project scoping, prioritization, and resource allocation to ensure timely delivery of data engineering solutions.
- Ensure data integrity, security, and compliance across all engineering solutions.
- Collaborate with IT and the rest of the data teams to align solutions with enterprise architecture.
- Establish documentation and governance standards for data engineering workflows ensuring completeness, audit readiness, and traceability in alignment with enterprise architecture.
- Directly supervise the Data Architecture & Data Engineering team in accordance with Nicolet's policies and applicable laws. Responsibilities include interviewing, hiring, and training employees; planning, assigning, and directing work; appraising performance; coaching, mentoring and development planning; rewarding and disciplining employees; addressing complaints and resolving problems.
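The ETL/ELT merge pattern behind the responsibilities above (e.g., a Snowflake `MERGE` statement or a dbt incremental model) can be sketched in plain Python. This is a minimal illustration of the idempotent upsert idea only; the table and column names are hypothetical, and a real implementation would run as SQL inside the warehouse:

```python
# Minimal sketch of an idempotent ELT "merge" (upsert) step -- the pattern
# implemented by Snowflake MERGE statements and dbt incremental models.
# All table/column names here are hypothetical illustrations.

def merge_upsert(target: dict, staged_rows: list, key: str) -> dict:
    """Apply staged rows to a target table keyed by `key`.

    Rows whose key already exists are updated; new keys are inserted.
    Running the same batch twice leaves the target unchanged (idempotent),
    which is what makes reruns of a failed pipeline safe.
    """
    for row in staged_rows:
        target[row[key]] = row  # update-or-insert on the business key
    return target

# Usage: a tiny "dim_product" table merged with a staged batch.
dim_product = {
    "SKU-1": {"sku": "SKU-1", "price": 10.0},
}
staged = [
    {"sku": "SKU-1", "price": 12.0},   # update existing row
    {"sku": "SKU-2", "price": 25.0},   # insert new row
]
merge_upsert(dim_product, staged, key="sku")
merge_upsert(dim_product, staged, key="sku")  # second run is a no-op

print(len(dim_product))               # 2 rows after merge
print(dim_product["SKU-1"]["price"])  # 12.0
```

Idempotence is the property worth testing for in any warehouse automation: replaying a batch after a partial failure must not duplicate or corrupt rows.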
Qualifications:
- Bachelor's degree in Computer Science, Data Engineering, Data Analytics or related field.
- 7+ years in data engineering or related data roles required.
- 3+ years in leadership or management positions required.
- Strong technical expertise in Snowflake, DBT, Azure Data Factory, and SQL or similar systems.
- Familiarity with Alteryx, UiPath, Tableau, Power BI and Salesforce is preferred.
- Ability to design and implement scalable data solutions.
- Excellent leadership, communication, and organizational skills.
- Ability to balance hands-on development with team development.
- Must be able to work fully in-office. This position does not allow for remote work.
Benefits:
- Medical, Dental, Vision, & Life Insurance
- 401(k) with a company match
- PTO & 11 1/2 Paid Holidays
The above statements are intended to describe the general nature and level of work being performed. They are not intended to be construed as an exhaustive list of all responsibilities and skills required for the position.
Equal Opportunity Employer/Veterans/Disabled
About Pinterest:
Millions of people around the world come to our platform to find creative ideas, dream about new possibilities and plan for memories that will last a lifetime. At Pinterest, we're on a mission to bring everyone the inspiration to create a life they love, and that starts with the people behind the product.
Discover a career where you ignite innovation for millions, transform passion into growth opportunities, celebrate each other's unique experiences and embrace the flexibility to do your best work. Creating a career you love? It's Possible.
At Pinterest, AI isn't just a feature, it's a powerful partner that augments our creativity and amplifies our impact, and we're looking for candidates who are excited to be a part of that. To get a complete picture of your experience and abilities, we'll explore your foundational skills and how you collaborate with AI.
Through our interview process, what matters most is that you can always explain your approach, showing us not just what you know, but how you think. You can read more about our AI interview philosophy and how we use AI in our recruiting process here.
Team & Mission
The Privacy & Conversion Data team is responsible for how the company safely and compliantly uses conversion data to power monetization. We build and operate the core privacy infrastructure behind ads reporting and optimization, including controlled data environments, fine-grained access controls, centralized privacy rules enforcement, and de-identification pipelines for conversion data. Our mission is to make conversion data privacy-preserving by default: centralized, de-identified, auditable, and easy for teams to use, while maintaining high utility for advertisers and staying ahead of an evolving global regulatory landscape.
Role Summary
We're seeking a Staff Engineer to lead the architecture and technical direction for the conversion data privacy platform, spanning both core Conversion Data systems and de-identification for ads reporting. You'll own the end-to-end design and evolution of privacy-critical pipelines and services, partner closely with Product, Data Science, Legal, and infrastructure teams, and set the technical bar for how we use conversion data safely at scale.
What you'll do:
- Lead the technical strategy and architecture for conversion data privacy across access controls, de-identification, deletion, and privacy rules enforcement, driving toward a centralized, de-identified-by-default, automated privacy platform for monetization.
- Design and evolve core privacy infrastructure, including controlled environments for sensitive data, fine-grained authorization and policy enforcement, and a central policy repository that consistently governs access across major data platforms and query engines.
- Own de-identification pipelines for ads reporting end-to-end: from separating sensitive and non-sensitive data, applying de-identification techniques and transformations, and generating privacy-preserving datasets, to validating data utility and feeding reporting and analytics surfaces.
- Build and improve privacy frameworks and tooling (for both online and offline workflows) that make safe, compliant conversion data usage simple and self-service for downstream teams, reducing onboarding friction for new datasets, restrictions, and use cases.
- Drive operational excellence and compliance by defining SLAs, building robust monitoring and alerting (e.g., de-identification quality, opt-out metrics, data leakages), leading incident response, and developing performant deletion and leakage-handling workflows that meet regulatory and audit requirements.
- Partner cross-functionally with ads, data, product, legal, and infrastructure stakeholders to translate legal/privacy requirements into technical designs, make clear trade-offs between privacy and utility, and drive alignment on roadmaps, launches, and policy changes that impact advertisers and users.
- Mentor and uplevel engineers across multiple teams, lead critical design and code reviews in privacy-sensitive areas, and establish best practices and documentation for privacy-by-design, de-identification, and large-scale data systems.
What we're looking for:
- BS+ in Computer Science (or related field) or equivalent practical experience.
- 8+ years of professional software engineering experience, with a focus on large-scale data systems or distributed systems.
- Strong proficiency building and operating data pipelines and services using Java/Scala/Kotlin or Python, plus SQL; experience with modern big data ecosystems is a plus.
- Experience designing secure, reliable systems and APIs, with solid grounding in data modeling, access control, and performance optimization.
- Meaningful experience in at least one of: privacy-preserving data systems (e.g., de-identification, k-anonymity), ads measurement/attribution, or large-scale analytics/experimentation platforms.
- Proven ability to drive cross-team technical initiatives from design through rollout, working closely with product, data science, and non-engineering partners (e.g., Legal, Compliance).
- Strong communication and leadership skills, with a track record of mentoring engineers, raising engineering standards, and making sound decisions in ambiguous, high-impact problem spaces.
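The k-anonymity technique named in the qualifications above has a simple core property: a released dataset is k-anonymous if every combination of quasi-identifier values appears at least k times, so no individual record is uniquely re-identifiable by those attributes. The sketch below is a minimal illustration of that check only, with hypothetical field names; it is not the team's actual implementation:

```python
from collections import Counter

def is_k_anonymous(rows: list, quasi_identifiers: list, k: int) -> bool:
    """Return True if every quasi-identifier combination occurs >= k times.

    A de-identification pipeline would generalize or suppress attribute
    values until this property holds before releasing the dataset.
    """
    combos = Counter(
        tuple(row[q] for q in quasi_identifiers) for row in rows
    )
    return all(count >= k for count in combos.values())

# Usage: hypothetical conversion records with coarse quasi-identifiers.
records = [
    {"age_band": "30-39", "region": "US-West", "converted": True},
    {"age_band": "30-39", "region": "US-West", "converted": False},
    {"age_band": "40-49", "region": "US-East", "converted": True},
]
# The (40-49, US-East) combination appears only once, so k=2 fails.
print(is_k_anonymous(records, ["age_band", "region"], k=2))  # False
```

In practice the interesting engineering work is in the generalization step (e.g., widening age bands or coarsening regions) that trades data utility for privacy until the check passes.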
In-Office Requirement Statement:
- We recognize that the ideal environment for work is situational and may differ across departments. What this looks like day-to-day can vary based on the needs of each organization or role.
Relocation Statement:
- This position is not eligible for relocation assistance. Visit our PinFlex page to learn more about our working model.
#LI-REMOTE
#LI-KK6
At Pinterest we believe the workplace should be equitable, inclusive, and inspiring for every employee. In an effort to provide greater transparency, we are sharing the base salary range for this position. The position is also eligible for equity. Final salary is based on a number of factors including location, travel, relevant prior experience, or particular skills and expertise.
Information regarding the culture at Pinterest and benefits available for this position can be found here.
US based applicants only: $177,185–$364,795 USD
Our Commitment to Inclusion:
Pinterest is an equal opportunity employer and makes employment decisions on the basis of merit. We want to have the best qualified people in every job. All qualified applicants will receive consideration for employment without regard to race, color, ancestry, national origin, religion or religious creed, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, age, marital status, status as a protected veteran, physical or mental disability, medical condition, genetic information or characteristics (or those of a family member) or any other consideration made unlawful by applicable federal, state or local laws. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. If you require a medical or religious accommodation during the job application process, please complete this form for support.
We're committed to fostering an environment for every teammate that's welcoming, respectful and inclusive, with great opportunity for professional growth.
Find your future with us.
787 Airframe in Boeing Commercial Airplanes is hiring a Senior (Level 5) Structural Analysis Engineer for our Wing, Wing Edges, Empennage, Systems Stress, or Interiors Stress teams to work at our North Charleston, SC location.
At Boeing, our engineers share a passion to redefine what's possible.
To turn dreams into reality.
To bring cutting edge technology to market.
If you are seeking a dynamic, innovative environment, this opportunity is for you! This is a great opportunity to work in the Commercial Airplanes South Carolina Design Center, as a Structural Analysis Engineer.
Primary Responsibilities:
- Plans and organizes structural analysis work at the program level to assure total compliance with structural integrity and stability requirements.
- Develops, integrates and documents complex or unique structural requirements to establish the system design.
- Coordinates with other engineering groups to establish the product's environment.
- Guides product design and verifies structural integrity by defining requirements for development of analytical methods, finite element models/simulations and other analysis tools to develop the structural environment, characteristics and performance of the product.
- Works with customers and regulatory agencies to define certification processes that will ensure requirements are met.
- Reviews and approves certification results.
- Develops test plans and configurations, supports test execution and analyzes/reports test results to validate and verify systems and components meet requirements and specifications.
- Defines and organizes test programs to substantiate for customers and regulatory agencies that requirements are satisfied.
- Develops analytical processes/tools to improve effectiveness, quality and efficiency of the development effort.
- Investigates emerging technologies to develop future product designs to meet projected requirements.
- Works under consultative direction.
This position is expected to be 100% onsite. The selected candidate will be required to work onsite at the North Charleston, SC location.
Basic Qualifications (Required Skills/Experience):
- Bachelor of Science degree in Engineering (with a focus on Mechanical, Civil, Aerospace, Aeronautical or Material Sciences)
- 12+ years of experience in the structural or aerospace industry
- 9+ years of experience performing structural analysis on metallic structures
- 9+ years of experience performing structural analysis on composite material systems (if seeking to work in the Wing, Wing Edges, or Empennage team)
- 9+ years of experience working on primary aircraft structures, airplane systems secondary structures, or interiors secondary structures
- 3+ years of experience collaborating and performing oversight on projects with international partners/suppliers
- 3+ years of experience supporting test development and execution
- Ability to assist, advise, and check classical hand analysis methods
- Ability to perform FEA and understand/utilize/validate results
- Prior experience with aircraft, heavy structures, hardware systems or interiors manufacturing/production systems
Preferred Qualifications (Desired Skills/Experience):
- 12+ years of related work experience or an equivalent combination of education and experience
- 5+ years of experience planning, organizing, and executing program level structural analysis
- 3+ years of experience developing, integrating, and documenting complex or unique structural requirements
- 3+ years of experience with program level cross-functional integration
- 3+ years of experience working with customer and regulatory agencies to define certification strategy that will ensure requirements are met
- Current or prior DER/AR/E-UM
- Experience leading or performing structural analysis for a full lifecycle of a structure
Conflict of Interest: Successful candidates for this job must satisfy the Company's Conflict of Interest (COI) assessment process.
Drug Free Workplace: Boeing is a Drug Free Workplace where post offer applicants and employees are subject to testing for marijuana, cocaine, opioids, amphetamines, PCP, and alcohol when criteria are met as outlined in our policies.
Shift: This role is for 1st shift; however, there may be additional shift requirements to support program objectives.
Pay Range Summary: At Boeing, we strive to deliver a Total Rewards package that will attract, engage and retain the top talent. Elements of the Total Rewards package include competitive base pay and variable compensation opportunities. The Boeing Company also provides eligible employees with an opportunity to enroll in a variety of benefit programs, generally including health insurance, flexible spending accounts, health savings accounts, retirement savings plans, life and disability insurance programs, and a number of programs that provide for both paid and unpaid time away from work. The specific programs and options available to any given employee may vary depending on eligibility factors such as geographic location, date of hire, and the applicability of collective bargaining agreements. Please note that the salary information shown below is a general guideline only. Pay is based upon candidate experience and qualifications, as well as market and business considerations.
Summary pay range for Senior (Level 5): $154,700 – $209,300
Applications for this position will be accepted until Apr. 16, 2026.
Export Control Requirements: This position must meet U.S. export control compliance requirements. To meet U.S. export control compliance requirements, a "U.S. Person" as defined by 22 C.F.R. §120.62 is required. "U.S. Person" includes U.S. Citizen, U.S. National, lawful permanent resident, refugee, or asylee.
Export Control Details: US based job, US Person required
Education: Bachelor's Degree or Equivalent Required
Relocation: This position offers relocation based on candidate eligibility.
Visa Sponsorship: Employer will not sponsor applicants for employment visa status.
Shift: This position is for 1st shift.
Equal Opportunity Employer: Boeing is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, national origin, gender, sexual orientation, gender identity, age, physical or mental disability, genetic factors, military/veteran status or other characteristics protected by law.
Join our Pleasanton, CA Team!
Interventional/Structural Cardiologist
Salary Range: $550,000 annual base guarantee (2 years) + Productivity & Performance Incentives + Sign-On + Relocation
Requirements to Apply
- Medical degree (MD or DO), residency, and any applicable fellowship completed at U.S. accredited training programs
- Board Certified or Board Eligible in Cardiovascular Disease
- Fellowship training in Interventional Cardiology
- Active, unrestricted California medical license (or ability to obtain)
- Current DEA registration
Preferred
- Interest in developing Structural Heart services
- Experience in community-based cardiology practice
Interventional Cardiologist Job in Pleasanton, CA – Stanford-Affiliated Community Practice – Structural Heart Growth Opportunity
Job Overview
This full-time Interventional Cardiology opportunity offers the ability to practice in a community-based setting affiliated with a nationally recognized academic health system. The clinical team includes four Interventional Cardiologists, four Non-Invasive Cardiologists, and two Electrophysiologists providing comprehensive cardiovascular care.
The practice is based in Pleasanton with satellite clinics in Castro Valley and Emeryville. Hospital coverage is provided at a local Stanford-affiliated hospital. Call responsibilities are distributed equally among physicians.
Advanced cardiac diagnostic services include echocardiography, cardiac MRI, coronary CTA, nuclear imaging, multi-modality stress testing, arrhythmia monitoring, and vascular imaging. There are opportunities to expand and develop a Structural Heart program. Providers benefit from robust ancillary support staff and collaboration with specialists within and outside the broader academic system.
EPIC EMR is utilized.
What Are the Benefits?
- 2-year base salary guarantee of $550,000
- wRVU-based productivity incentive bonus
- Up to 10% annual performance incentives
- Sign-On Bonus
- Relocation Assistance
- 401(k) Safe Harbor and profit-sharing contributions
- Comprehensive health plans (including $0 premium option)
- Dental, vision, life, and disability coverage
- Full malpractice coverage with prior acts
- PTO including paid holidays and extended sick leave
- CME allowance and paid CME time
- Gym membership and cell phone reimbursement
Where?
Pleasanton, located in the Tri-Valley region of Northern California, offers a high quality of life with top-rated schools, scenic parks, and a vibrant downtown. Situated between San Francisco and Silicon Valley, the area provides suburban comfort with easy access to major metropolitan amenities, fine dining, cultural attractions, and outdoor recreation.
Who Are We?
We are a physician-led, physician-managed multispecialty medical group committed to delivering clinical excellence, innovation, and collaborative care. In partnership with a leading academic medical institution, our mission is to advance precision health and wellness while serving our communities with integrity and compassion.