Job Summary:
Our client is seeking a Data Steward to join their team! This position is located Hybrid in Creve Coeur, Missouri.
Duties:
- Understand business capability needs and processes as they relate to IT solutions through partnering with Product Managers and business and functional IT stakeholders
- Participate in data scraping, data curation and data compilation efforts
- Ensure high-quality data delivery to end users
- Maintain in-house data quality through ongoing data stewardship
- Implement and utilize data solutions for data analysis and profiling using a variety of tools such as SQL, Postman, R, or Python and following the team’s established processes and methodologies
- Collaborate with other data stewards and engineers within the team and across teams on aligning delivery dates and integration efforts
- Define data quality rules and implement automated monitoring, reporting, and remediation solutions
- Coordinate intake and resolution of data support tickets
- Support data migration from legacy systems, data inserts and updates not supported by applications
- Partner with the Data Governance organization to ensure data is secured and access is being managed appropriately
- Identify gaps in existing processes and create new documentation templates to improve existing processes and procedures
- Create mapping documents and templates to improve existing manual processes
- Perform data discoveries to understand data formats, source systems, etc. and engage with business partners in this discovery process
- Help answer questions from the end-users and coordinate with technical resources as needed
- Build prototype SQL and continuously engage with end consumers with enhancements
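The data profiling and prototype work described in the duties above might look something like this minimal sketch (illustrative only; the sample columns and the pandas approach are assumptions, not part of the posting):

```python
import pandas as pd

# Hypothetical sample of in-house data to be stewarded
df = pd.DataFrame({
    "customer_id": [1, 2, 2, None, 5],
    "region": ["US", "US", "EU", "EU", None],
})

# Simple per-column profile: null rate and distinct count, the kind of
# check a data steward might automate before deeper curation
profile = pd.DataFrame({
    "null_rate": df.isna().mean(),
    "distinct": df.nunique(dropna=True),
})
print(profile)
```

In practice the same summary would typically be expressed as prototype SQL against the source system; the pandas version is just a quick local equivalent.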
Desired Skills/Experience:
- Bachelor's Degree in Computer Science, Engineering, Science, or other related field
- Applied experience with modern engineering technologies and data principles, such as big data cloud compute and NoSQL
- Applied experience querying SQL and/or NoSQL databases
- Experience in designing data catalogs, including data design, metadata structures, object relations, catalog population, etc.
- Data Warehousing experience
- Strong written and verbal communication skills
- Comfortable balancing demands across multiple projects / initiatives
- Ability to identify gaps in requirements based on business subject matter domain expertise
- Ability to deliver detailed technical documentation
- Expert level experience in relevant business domain
- Experience managing data within SAP
- Experience managing data using APIs
- BigQuery experience
Benefits:
- Medical, Dental, & Vision Insurance Plans
- Employee-Owned Profit Sharing (ESOP)
- 401K offered
The approximate pay range for this position starts at $104,000 - $115,000+. Please note that the pay range provided is a good faith estimate. Final compensation may vary based on factors including but not limited to background, knowledge, skills, and location. We comply with local wage minimums.
At KellyMitchell, our culture is world class. We’re movers and shakers! We don’t mind a bit of friendly competition, and we reward hard work with unlimited potential for growth. This is an exciting opportunity to join a company known for innovative solutions and unsurpassed customer service. We're passionate about helping companies solve their biggest IT staffing & project solutions challenges. As an employee-owned, women-led organization serving Fortune 500 companies nationwide, we deliver expert service at a moment's notice.
By applying for this job, you agree to receive calls, AI-generated calls, text messages, or emails from KellyMitchell and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy at
About Wakefern
Wakefern Food Corp. is the largest retailer-owned cooperative in the United States and supports its co-operative members' retail operations, trading under the ShopRite®, Price Rite®, The Fresh Grocer®, Dearborn Markets®, and Gourmet Garage® banners.
Employing an innovative approach to wholesale business services, Wakefern focuses on helping the independent retailer compete in a big business world. Providing the tools entrepreneurs need to stay a step ahead of the competition, Wakefern’s co-operative members benefit from the company’s extensive portfolio of services, including innovative technology, private label development, and best-in-class procurement practices.
The ideal candidate will have a strong background in designing, developing, and implementing complex projects, with a focus on automating data processes and driving efficiency within the organization. This role requires close collaboration with application developers, data engineers, data analysts, and data scientists to ensure seamless data integration and automation across various platforms. The Data Integration & AI Engineer is responsible for identifying opportunities to automate repetitive data processes, reduce manual intervention, and improve overall data accessibility.
Essential Functions
- Participate in the development life cycle (requirements definition, project approval, design, development, and implementation) and maintenance of the systems.
- Implement and enforce data quality and governance standards to ensure data accuracy and consistency.
- Provide input for project plans and timelines to align with business objectives.
- Monitor project progress, identify risks, and implement mitigation strategies.
- Work with cross-functional teams and ensure effective communication and collaboration.
- Provide regular updates to the management team.
- Follow the standards and procedures according to Architecture Review Board best practices, revising standards and procedures as requirements change and technological advancements are incorporated into the technology infrastructure.
- Communicates and promotes the code of ethics and business conduct.
- Ensures completion of required company compliance training programs.
- Is trained – either through formal education or through experience – in software / hardware technologies and development methodologies.
- Stays current through personal development and professional and industry organizations.
Responsibilities
- Design, build, and maintain automated data pipelines and ETL processes to ensure scalability, efficiency, and reliability across data operations.
- Develop and implement robust data integration solutions to streamline data flow between diverse systems and databases.
- Continuously optimize data workflows and automation processes to enhance performance, scalability, and maintainability.
- Design and develop end-to-end data solutions utilizing modern technologies, including scripting languages, databases, APIs, and cloud platforms.
- Ensure data solutions and data sources meet quality, security, and compliance standards.
- Monitor and troubleshoot automation workflows, proactively identifying and resolving issues to minimize downtime.
- Provide technical training, documentation, and ongoing support to end users of data automation systems.
- Prepare and maintain comprehensive technical documentation, including solution designs, specifications, and operational procedures.
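The automated pipeline and ETL responsibilities above can be reduced to a minimal extract-transform-load sketch (all sources, schemas, and function names here are hypothetical; a production pipeline would run under an orchestrator such as Airflow or Cloud Composer rather than a straight function chain):

```python
def extract():
    # Stand-in for reading from a source system or API
    return [{"sku": "A1", "qty": "3"}, {"sku": "B2", "qty": "5"}]

def transform(rows):
    # Enforce types and drop malformed rows: a basic data-quality gate
    return [{"sku": r["sku"], "qty": int(r["qty"])}
            for r in rows if r["qty"].isdigit()]

def load(rows, target):
    # Stand-in for writing to a warehouse table
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

The value of an orchestrator over this direct chaining is retries, scheduling, and dependency tracking between many such steps, which is why the qualifications below call out Cloud Composer/Airflow experience.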
Qualifications
- A bachelor's degree or higher in computer science, information systems, or a related field.
- Hands-on experience with cloud data platforms (e.g., GCP, Azure, etc.)
- Strong knowledge and skills in data automation technologies, such as Python, SQL, ETL/ELT tools, Kafka, APIs, cloud data pipelines, etc.
- Experience with GCP BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
- Experience with workflow orchestration tools such as Cloud Composer or Airflow
- Proficiency in iPaaS (Integration Platform as a Service) platforms, such as Boomi, SAP BTP, etc.
- Experience developing and managing data integrations for AI agents, connecting them to internal and external APIs, databases, and knowledge sources to expand their capabilities.
- Experience building and maintaining scalable Retrieval-Augmented Generation (RAG) pipelines, including the curation and indexing of knowledge bases in vector databases (e.g., Pinecone, Vertex AI Vector Search).
- Experience leveraging cloud-based AI/ML platforms (e.g., Vertex AI, Azure ML) to build, train, and deploy machine learning models at scale.
- Experience establishing and enforcing data quality and governance standards for AI/ML datasets, ensuring the accuracy, completeness, and integrity of data used for model training and validation.
- Ability to collaborate closely with data scientists and machine learning engineers to understand data requirements and deliver optimized data solutions that support the entire machine learning lifecycle.
- Hands-on experience with IBM DataStage and Alteryx is a plus.
- Strong understanding of database design principles, including normalization, indexing, partitioning, and query optimization.
- Ability to design and maintain efficient, scalable, and well-structured database schemas to support both analytical and transactional workloads.
- Familiarity with BI visualization tools such as MicroStrategy, Power BI, Looker, or similar.
- Familiarity with data modeling tools.
- Familiarity with DevOps practices for data (CI/CD pipelines)
- Proficiency in project management software (e.g., JIRA, Clarizen, etc.)
- Strong knowledge and skills in data management, data quality, and data governance.
- Strong communication, collaboration, and problem-solving skills.
- Ability to work on multiple projects and prioritize tasks effectively.
- Ability to work independently and in a team environment.
- Ability to learn new technologies and tools quickly.
- Ability to handle stressful situations.
- Highly developed business acumen.
- Strong critical thinking and decision-making skills.
Working Conditions & Physical Demands
This position requires in-person office presence at least 4x a week.
Compensation and Benefits
The salary range for this position is $75,868 - $150,644. Placement in the range depends on several factors, including experience, skills, education, geography, and budget considerations.
Wakefern is proud to offer a comprehensive benefits package designed to support the health, well-being, and professional development of our Associates. Benefits include medical, dental, and vision coverage, life and disability insurance, a 401(k) retirement plan with company match & annual company contribution, paid time off, holidays, and parental leave.
Associates also enjoy access to wellness and family support programs, fitness reimbursement, educational and training opportunities through our corporate university, and a collaborative, team-oriented work environment. Many of these benefits are fully or partially funded by the company, with some subject to eligibility requirements.
Description
The Data Engineer is responsible for designing, building, and maintaining scalable data pipelines to support the bank's analytics, reporting, and decision-making processes. The role works closely with analysts, reporting and integration teams, and business stakeholders to ensure high-quality, secure, and efficient data solutions that comply with financial regulations and industry standards.
Below is a list of essential functions of this position. Additional responsibilities may be assigned.
KEY RESPONSIBILITIES
- Build and maintain data models, schemas, and databases (e.g., data warehouses, data lakes) to support business intelligence, machine learning, and reporting needs.
- Ensure data is optimized for performance, reliability, and scalability, minimizing latency and maximizing throughput.
- Build required infrastructure for optimal extraction, transformation and loading of data from various data sources using cloud and SQL technologies
- Implement data quality checks, monitoring, and validation processes to ensure accuracy, consistency, and compliance with regulatory requirements.
- Partner with business analysts and the data integration, automation, and IT teams to understand data requirements and deliver solutions that align with business goals.
- Ensure data adheres to strict security protocols and regulatory standards, including encryption, access controls, and audit trails.
- Champion data governance, quality standards, and performance optimization.
- Create and maintain comprehensive documentation for data schemas, processes and systems to ensure transparency and reproducibility.
ATTITUDES
Builds positive relationships with internal and external clients by valuing others' feelings and rights in both words and actions, and embracing others' unique beliefs, backgrounds, and perspectives by demonstrating:
- Respect - treat every client and colleague with dignity and respect.
- Client Focus - Design scalable and reliable data pipelines that directly support the client's business goals and decision-making needs. Actively engage with stakeholders to understand evolving requirements and deliver solutions that provide timely, actionable insights.
- Inclusion - Support a diverse work environment by building data systems that are accessible, equitable, and considerate of user needs, while actively seeking input from voices across all backgrounds and roles.
BEHAVIORS
Demonstrates strong business ethics and honest behaviors and the ability to positively influence and work with others to achieve excellent results by demonstrating:
- Leadership - Proactively drives data strategy, mentors peers, and sets high standards for quality, innovation, and collaboration across teams.
- Integrity - Establish and enforce program governance frameworks, including change control and release management.
- Collaboration - Works with stakeholders across all departments to drive data efforts. Serves as a key contributor between business stakeholders and technical teams.
- Volunteerism - Use your skills beyond the role by mentoring others, helping teammates, and supporting meaningful causes.
COMPETENCIES
Reflects skill, good judgement, positive conduct, and personal responsibility for assigned areas. Seeks to implement and leverage services and technologies that create efficiencies by demonstrating:
- Accountability - Takes ownership of work, ensuring data systems are reliable and accurate. Promptly addresses issues or errors with transparency and responsibility.
- Innovation - Embrace new ideas, new tools, and bold thinking; challenge the status quo.
- Professionalism - Consistently demonstrates courteous behavior, integrity, and a strong work ethic while representing the bank with a polished appearance and clear communication.
POSITION LEVEL(S) EXPECTATIONS
- Strong understanding of Data Models, databases, schemas, and security methodologies.
- Excellent leadership, strategic thinking, and stakeholder management skills.
SEEKS PROFESSIONAL DEVELOPMENT OPPORTUNITIES
Actively participate in expanding skill sets and career paths by attending training programs, workshops, certifications, and educational resources relevant to the role. Set stretch assignments and cross functional opportunities that foster growth and learning.
Requirements
QUALIFICATIONS, EDUCATION, & EXPERIENCE
To perform this position successfully, an individual must be able to perform each essential position requirement satisfactorily, and a skills inventory is listed below.
- Bachelor's degree in a technology-related program, or 3-5 years' experience in a data-related field.
- Strong understanding of data architecture and database design principles.
- Strong leadership and communication skills across technical and non-technical audiences.
- 3-5 years' experience in data roles.
- Proficiency in languages such as Python, Java, Scala, or SQL.
- Experience in financial services (banking, insurance, wealth management).
- Excellent problem-solving and communication skills, with a collaborative mindset.
- Demonstrated leadership and self-direction.
- A background screening will be conducted.
LANGUAGE SKILLS: Ability to read, comprehend, and interpret documents. Possesses professional communication and interpersonal skills to write and speak effectively both one-on-one and before groups of clients or employees of the organization. Ability to communicate to clients directly and effectively.
TECHNOLOGY SKILLS: Ability to utilize telephone systems and possess good digital literacy including email, internet and intranet use. Strong understanding of Salesforce platform capabilities and implementation methodologies.
MATHEMATICAL SKILLS: Ability to add, subtract, multiply, and divide in all units of measure.
REASONING ABILITY: Ability to apply common sense understanding to carry out instructions furnished in written, oral, or diagram form. Ability to solve challenging problems involving several variables in a standardized situation.
PHYSICAL DEMANDS AND WORK ENVIRONMENT: The physical demands and work environment described here are representative of those that must be met by an employee to successfully perform the essential functions of this position.
This position operates in a professional office environment with considerable time spent at a desk using office equipment such as computers, phones, and printers. Ability to travel on occasion to all market areas and attend seminars or training sessions offsite and employee meetings off-site.
Reasonable accommodation may be made to enable individuals with disabilities to perform the essential functions.
DISCLAIMER: This job description is not an exclusive list of responsibilities and duties. They may change at any time without notice.
BENEFITS
- Medical, Dental, Vision & Life Insurance
- 401K with company match
- Paid Time Off & Recognized Holidays
- Leave policies
- Voluntary Benefit Options (Life, Accident, Critical Illness, Hospital Indemnity & Pet)
- Employee Assistance Program
- Employee Health & Wellness Program
- Special Loan and Deposit Rates
- Gradifi Student Loan Paydown Plan
- Rewards & Recognition Programs and much more!
Eligibility requirements apply.
CNB Bank is an equal opportunity employer and all applicants are considered based on qualifications without regard to sex, race, color, ancestry, religious creed, national origin, sexual orientation, gender identity, physical disability, mental disability, age, marital status, disabled veteran or Vietnam era veteran status. CNB Financial Corporation is an Affirmative Action Employer and is committed to fostering, cultivating and preserving a culture of diversity and inclusion.
Job Summary:
Our client is seeking a Senior Data Analytics Engineer (Customer Data) to join their team! This position is located in Irving, Texas.
Duties:
- Support cross-functional teams including Marketing, Data Science, Product, and Digital
- Build datasets that power: customer segmentation, personalization workflows, campaign and lifecycle analytics, BI dashboards and KPIs and real-time and ML-driven customer experiences
- Build, optimize, and maintain customer data pipelines using PySpark/Databricks
- Transform raw customer data into analytics‑ready datasets for reporting, segmentation, personalization, and AI/ML applications
- Develop customer behavior metrics, campaign insights, and lifecycle reporting layers
- Design datasets used by Power BI/Tableau; dashboard creation is a plus, not required
- Optimize Databricks performance such as: skewed joins, partitioning, sorting, caching/persist strategy
- Work across AWS/Azure/GCP and integrate pipelines with CDPs
- Participate in ingestion and digestion phases to shape MarTech and BI analytical layers
- Document and uphold data engineering standards, governance, and best practices across teams
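The skewed-join optimization called out in the duties above is often handled by key salting. Here is a plain-Python sketch of the idea (illustrative only; the actual pipeline would use PySpark's DataFrame API on Databricks, and all names and values here are assumptions):

```python
import random

# Hypothetical skewed fact rows: one "hot" customer key dominates
facts = [("hot", i) for i in range(1000)] + [("cold", 1)]
dims = {"hot": "VIP segment", "cold": "Standard segment"}

SALTS = 4

# Salt the skewed side: spread each hot key across SALTS buckets
salted_facts = [((k, random.randrange(SALTS)), v) for k, v in facts]

# Replicate the small dimension side across every salt value so the
# join still matches; this mirrors broadcasting the dim table in Spark
salted_dims = {(k, s): seg for k, seg in dims.items() for s in range(SALTS)}

# The join now distributes the hot key over SALTS partitions
joined = [(k, v, salted_dims[(k, s)]) for (k, s), v in salted_facts]
```

On recent Databricks runtimes, adaptive query execution can apply a similar mitigation automatically (e.g., the `spark.sql.adaptive.skewJoin.enabled` setting), so manual salting is typically reserved for cases AQE does not catch.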
Desired Skills/Experience:
- 6+ years in Data Engineering or Analytics Engineering
- Strong hands-on experience with: Databricks, PySpark, Python and SQL
- Proven experience with customer/marketing data: segmentation, personalization, campaign analytics, retention, behavioral metrics
- Ability to design performance‑optimized pipelines; batch or near real-time
- Experience building datasets consumed by Power BI/Tableau
- Understanding of CDP workflows, customer identity data, traits/feature modeling, and activation
- Strong communication skills, translating marketing needs into technical data solutions
- Power BI expertise, major plus
- Experience with Delta Lake, orchestration, or feature engineering for ML
- Background as an Analytics Engineer, BI/Data Modeling Engineer, or Data Engineer with strong analytics orientation
Benefits:
- Medical, Dental, & Vision Insurance Plans
- Employee-Owned Profit Sharing (ESOP)
- 401K offered
The approximate pay range for this position starts at $140,000. Please note that the pay range provided is a good faith estimate. Final compensation may vary based on factors including but not limited to background, knowledge, skills, and location. We comply with local wage minimums.
At KellyMitchell, our culture is world class. We’re movers and shakers! We don’t mind a bit of friendly competition, and we reward hard work with unlimited potential for growth. This is an exciting opportunity to join a company known for innovative solutions and unsurpassed customer service. We're passionate about helping companies solve their biggest IT staffing & project solutions challenges. As an employee-owned, women-led organization serving Fortune 500 companies nationwide, we deliver expert service at a moment's notice.
By applying for this job, you agree to receive calls, AI-generated calls, text messages, or emails from KellyMitchell and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy at
About Pinterest:
Millions of people around the world come to our platform to find creative ideas, dream about new possibilities and plan for memories that will last a lifetime. At Pinterest, we're on a mission to bring everyone the inspiration to create a life they love, and that starts with the people behind the product.
Discover a career where you ignite innovation for millions, transform passion into growth opportunities, celebrate each other's unique experiences and embrace the flexibility to do your best work. Creating a career you love? It's Possible.
At Pinterest, AI isn't just a feature, it's a powerful partner that augments our creativity and amplifies our impact, and we're looking for candidates who are excited to be a part of that. To get a complete picture of your experience and abilities, we'll explore your foundational skills and how you collaborate with AI.
Through our interview process, what matters most is that you can always explain your approach, showing us not just what you know, but how you think. You can read more about our AI interview philosophy and how we use AI in our recruiting process here.
About tvScientific
tvScientific is the first and only CTV advertising platform purpose-built for performance marketers. We leverage massive data and cutting-edge science to automate and optimize TV advertising to drive business outcomes. Our solution combines media buying, optimization, measurement, and attribution in one efficient platform. Our platform is built by industry leaders with a long history in programmatic advertising, digital media, and ad verification who have now purpose-built a CTV performance platform advertisers can trust to grow their business.
As a Senior Data Engineer at tvScientific, you will be a key player in implementing the robust data infrastructure to power our data-heavy company. You will collaborate with our cross-functional teams to evolve our core data pipelines, design for efficiency as we scale, and store data in optimal engines and formats. This is an individual contributor role, where you will work to define and implement a strategic vision for data engineering within the organization.
What you'll do:
- Implement robust data infrastructure in AWS, using Spark with Scala
- Evolve our core data pipelines to efficiently scale for our massive growth
- Store data in optimal engines and formats
- Collaborate with our cross-functional teams to design data solutions that meet business needs
- Build out fault-tolerant batch and streaming pipelines
- Leverage and optimize AWS resources while designing for scale
- Collaborate closely with our Data Science and Product teams
How we'll define success:
- Successful implementation of scalable and efficient data infrastructure
- Timely delivery and optimization of data assets and APIs
- High attention to detail in implementation of automated data quality checks
- Effective collaboration with cross-functional teams
What we're looking for:
- Production data engineering experience
- Proficiency in Spark and Scala, with proven experience building data infrastructure in Spark using Scala
- Familiarity with data lakes, cloud warehouses, and storage formats
- Strong proficiency in AWS services
- Expertise in SQL for data manipulation and extraction
- Excellent written and verbal communication skills
- Bachelor's degree in Computer Science or a related field
Nice-to-haves:
- Experience in adtech
- Experience implementing data governance practices, including data quality, metadata management, and access controls
- Strong understanding of privacy-by-design principles and handling of sensitive or regulated data
- Familiarity with data table formats like Apache Iceberg, Delta
In-Office Requirement Statement:
- We recognize that the ideal environment for work is situational and may differ across departments. What this looks like day-to-day can vary based on the needs of each organization or role.
Relocation Statement:
- This position is not eligible for relocation assistance. Visit our PinFlex page to learn more about our working model.
#LI-SM4
#LI-REMOTE
At Pinterest we believe the workplace should be equitable, inclusive, and inspiring for every employee. In an effort to provide greater transparency, we are sharing the base salary range for this position. The position is also eligible for equity. Final salary is based on a number of factors including location, travel, relevant prior experience, or particular skills and expertise.
Information regarding the culture at Pinterest and benefits available for this position can be found here.
US based applicants only: $123,696—$254,667 USD
Our Commitment to Inclusion:
Pinterest is an equal opportunity employer and makes employment decisions on the basis of merit. We want to have the best qualified people in every job. All qualified applicants will receive consideration for employment without regard to race, color, ancestry, national origin, religion or religious creed, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, age, marital status, status as a protected veteran, physical or mental disability, medical condition, genetic information or characteristics (or those of a family member) or any other consideration made unlawful by applicable federal, state or local laws. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. If you require a medical or religious accommodation during the job application process, please complete this form for support.
About Pinterest:
Millions of people around the world come to our platform to find creative ideas, dream about new possibilities and plan for memories that will last a lifetime. At Pinterest, we're on a mission to bring everyone the inspiration to create a life they love, and that starts with the people behind the product.
Discover a career where you ignite innovation for millions, transform passion into growth opportunities, celebrate each other's unique experiences and embrace the flexibility to do your best work. Creating a career you love? It's Possible.
At Pinterest, AI isn't just a feature, it's a powerful partner that augments our creativity and amplifies our impact, and we're looking for candidates who are excited to be a part of that. To get a complete picture of your experience and abilities, we'll explore your foundational skills and how you collaborate with AI.
Through our interview process, what matters most is that you can always explain your approach, showing us not just what you know, but how you think. You can read more about our AI interview philosophy and how we use AI in our recruiting process here.
About tvScientific
tvScientific is the first and only CTV advertising platform purpose-built for performance marketers. We leverage massive data and cutting-edge science to automate and optimize TV advertising to drive business outcomes. Our solution combines media buying, optimization, measurement, and attribution in one efficient platform. Our platform is built by industry leaders with a long history in programmatic advertising, digital media, and ad verification who have now purpose-built a CTV performance platform advertisers can trust to grow their business.
As a Staff Data Engineer at tvScientific, you will be a key player in implementing the robust data infrastructure to power our data-heavy company. You will collaborate with our cross-functional teams to evolve our core data pipelines, design for efficiency as we scale, and store data in optimal engines and formats. This is an individual contributor role, where you will work to define and implement a strategic vision for data engineering within the organization.
What you'll do:
- Design and implement robust data infrastructure in AWS, using Spark with Scala
- Evolve our core data pipelines to efficiently scale for our massive growth
- Store data in optimal engines and formats, matching your designs to our performance needs and cost factors
- Collaborate with our cross-functional teams to design data solutions that meet business needs
- Design and implement knowledge graphs, exposing their functionality both via Batch Processing and APIs
- Leverage and optimize AWS resources while designing for scale
- Collaborate closely with our Data Science and Product teams
How we'll define success:
- Successful design and implementation of scalable and efficient data infrastructure
- Timely delivery and optimization of data assets and APIs
- High attention to detail in implementation of automated data quality checks
- Effective collaboration with cross-functional teams
What we're looking for:
- Production data engineering experience
- Proficiency in Spark and Scala, with proven experience building data infrastructure in Spark using Scala
- Experience in delivering significant technical initiatives and building reliable, large scale services
- Experience in delivering APIs backed by relationship-heavy datasets
- Familiarity with data lakes, cloud warehouses, and storage formats
- Strong proficiency in AWS services
- Expertise in SQL for data manipulation and extraction
- Excellent written and verbal communication skills
- Bachelor's degree in Computer Science or a related field
Nice-to-haves:
- Experience in adtech
- Experience implementing data governance practices, including data quality, metadata management, and access controls
- Strong understanding of privacy-by-design principles and handling of sensitive or regulated data
- Familiarity with data table formats like Apache Iceberg, Delta
- Previous experience building out a Data Engineering function
- Proven experience working closely with Data Science teams on machine learning pipelines
In-Office Requirement Statement:
- We recognize that the ideal environment for work is situational and may differ across departments. What this looks like day-to-day can vary based on the needs of each organization or role.
Relocation Statement:
- This position is not eligible for relocation assistance. Visit our PinFlex page to learn more about our working model.
At Pinterest we believe the workplace should be equitable, inclusive, and inspiring for every employee. In an effort to provide greater transparency, we are sharing the base salary range for this position. The position is also eligible for equity. Final salary is based on a number of factors including location, travel, relevant prior experience, or particular skills and expertise.
Information regarding the culture at Pinterest and benefits available for this position can be found here.
US based applicants only: $155,584 - $320,320 USD
Our Commitment to Inclusion:
Pinterest is an equal opportunity employer and makes employment decisions on the basis of merit. We want to have the best qualified people in every job. All qualified applicants will receive consideration for employment without regard to race, color, ancestry, national origin, religion or religious creed, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, age, marital status, status as a protected veteran, physical or mental disability, medical condition, genetic information or characteristics (or those of a family member) or any other consideration made unlawful by applicable federal, state or local laws. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. If you require a medical or religious accommodation during the job application process, please complete this form for support.
The pay range for this role is $150,000 - $200,000/yr USD.
WHO WE ARE:
Headquartered in Southern California, Skechers—the Comfort Technology Company®—has spent over 30 years helping men, women, and kids everywhere look and feel good. Comfort innovation is at the core of everything we do, driving the development of stylish, high-quality products at a great value. From our diverse footwear collections to our expanding range of apparel and accessories, Skechers is a complete lifestyle brand.
ABOUT THE ROLE:
Skechers Digital Team is seeking a Digital Data Architect reporting to the Director, Digital Architecture, Consumer Domain. This role is responsible for designing and governing Skechers’ Consumer Data 360 ecosystem, enabling identity resolution, high-quality data foundations, personalization, loyalty intelligence, and machine learning capabilities across digital and retail channels.
The ideal candidate will be a strong technical leader, have hands-on full-stack technical knowledge in enterprise technologies related to Skechers' consumer domain, and have the ability to work in a fast-paced agile environment. You should have knowledge of consumer programs from an architecture/industry perspective, and you should have strong hands-on experience designing solutions on the Salesforce Core Platform (including configuration, integration, and data model best practices).
You will work cross-functionally with Digital Engineering, Data Engineering, Data Science, Loyalty, and Marketing teams to architect scalable, secure, and high-performance data platforms that support advanced personalization and recommender systems.
WHAT YOU’LL DO:
- Responsible for the full technical life cycle of consumer platform capabilities, which includes:
- Capability roadmap and technical architecture in alignment to consumer experience
- Technical planning, design, and execution
- Operations, analytics/reporting, and adoption
- Define and evolve Skechers’ Consumer Data 360 architecture, including identity resolution (deterministic and probabilistic matching) and unified customer profiles.
- Architect scalable data models and pipelines across CDP, CRM, e-commerce, marketing automation, data lake, and warehouse platforms.
- Establish enterprise data quality frameworks including validation, deduplication, anomaly detection, and observability.
- Optimize SQL workloads and large-scale distributed queries through performance tuning, partitioning, indexing, and workload management strategies.
- Design and oversee ML pipelines supporting personalization, churn modeling, and recommender systems.
- Partner with Data Science teams to productionize models using distributed platforms such as Databricks (Spark, Delta Lake, MLflow preferred).
- Ensure secure data governance, access control (RBAC/ABAC), and compliance with GDPR, CCPA, and related privacy regulations.
- Provide architectural oversight ensuring performance, scalability, resilience, and maintainability.
- Collaborate with stakeholders to translate business objectives (LTV growth, personalization lift, engagement) into scalable data solutions.
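The deterministic side of the identity resolution work described above can be sketched as merging records that share an exact, normalized identifier into one unified profile. This is a minimal illustration only; the field names and the survivorship rule are assumptions, and a production Consumer Data 360 system would also layer on probabilistic matching:

```python
# Minimal sketch of deterministic identity resolution: records sharing a
# normalized email merge into one unified profile. Field names ("source",
# "seen_at") and the survivorship rule are hypothetical.
from collections import defaultdict

def normalize_email(email):
    """Lowercase and strip whitespace so exact matching is consistent."""
    return email.strip().lower()

def resolve_profiles(records):
    """Group records by normalized email and merge each group."""
    groups = defaultdict(list)
    for rec in records:
        groups[normalize_email(rec["email"])].append(rec)
    profiles = []
    for email, recs in groups.items():
        merged = {"email": email,
                  "sources": sorted(r["source"] for r in recs)}
        # Simple survivorship rule: keep the most recently seen name.
        merged["name"] = max(recs, key=lambda r: r["seen_at"])["name"]
        profiles.append(merged)
    return profiles

records = [
    {"email": "Ana@Example.com ", "name": "Ana R.", "source": "crm", "seen_at": 1},
    {"email": "ana@example.com", "name": "Ana Reyes", "source": "ecom", "seen_at": 2},
    {"email": "bo@example.com", "name": "Bo Li", "source": "loyalty", "seen_at": 1},
]
profiles = resolve_profiles(records)
print(len(profiles))  # 2
```

Probabilistic matching extends this by scoring fuzzy agreement across several attributes (name, address, device) and merging above a confidence threshold.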
REQUIREMENTS:
- Degree in Computer Science, Data Engineering, or a related field, or equivalent experience.
- 12+ years of experience architecting enterprise data platforms in cloud environments.
- 9+ years of data engineering experience with a focus on consumer data.
- 6+ years of experience working with Salesforce platforms, including data models and enterprise integrations.
- Strong experience with Data 360 and identity resolution architectures.
- Proven expertise in SQL performance tuning and large-scale data modeling.
- Hands-on experience implementing ML pipelines and recommender systems in production environments.
- Experience with cloud technologies (AWS, GCP, or Azure).
- Experience with integration patterns (API, ETL, event streaming).
- Experience providing technical leadership and guidance across multiple projects and development teams.
- Experience translating business requirements into detailed technical specifications and working with development teams through implementation, including issue resolution and stakeholder communication.
- Strong project management skills including scope assessment, estimation, and clear technical communication with both business users and technical teams.
- Must hold at least one of the following Salesforce certifications: Platform App Builder, Platform Developer I, or JavaScript Developer I.
- Experience with Databricks or similar distributed data/ML platforms preferred.
Overall Responsibility:
This role supports the design, development, and optimization of Arora's enterprise data and ERP systems. It reports directly to the Data Analytics Manager and works to improve financial reporting, support platform integrations, and build scalable data architecture that enables informed decision-making across the organization.
The position combines technical execution (SQL, automation, system configuration) with financial reporting support and cross-platform integration work to ensure accuracy, efficiency, and long-term system sustainability.
Essential Functions:
- Execute reporting and system requests in alignment with established data governance standards and reporting frameworks under the direction of the Data Analytics Manager.
- Contribute to the design of data models and system workflows that reduce manual processes and improve cross-functional data visibility.
- Support internal dashboards by creating backend data solutions and integrating with Vision.
- Provide system-level troubleshooting and ensure data consistency and reliability across platforms.
- Collaborate with teams to streamline processes through automation and data tools.
- Maintain documentation of data procedures, workflows, and system modifications.
- Support financial reporting and analysis by developing standardized, scalable reporting solutions aligned with company-wide data architecture.
- Assist in translating financial and operational requirements into structured reporting outputs and automation workflows.
- Assist in platform integrations (ERP, CRM, BI tools, and other enterprise systems) to support long-term architectural alignment and scalability.
Needed Skills:
- Expert-level SQL programming ability to support data processes. Knowledge of other programming languages (Java, Python, etc.) may also be needed.
- Ability to create and maintain productive relationships with employees, clients, and vendors.
Education/Experience Minimum:
- 3-5 years of experience
- Strong programming skills, including the ability to write complex queries.
- Preferred familiarity with all Microsoft platforms, including but not limited to Excel, Power BI, SharePoint, and SQL Server.
- Preferred experience with Deltek Vision v7.6 and VantagePoint
- Experience in building automated processes and data workflows.
- Strong problem-solving and attention to detail.
Get an insider view of the fast-changing grocery retail industry while developing relevant business, technical and leadership skills geared towards enhancing your career. This paid Co-op experience is an opportunity to help drive business results in an environment designed to promote and reward diversity, innovation and leadership. Applicants must be currently enrolled in a bachelor's or master's degree program.
**Applicants must be currently authorized to work in the United States on a full-time basis and be available from July 13, 2026, through December 4, 2026. We have a hybrid work environment that requires a minimum of three days a week in the office. Please submit your resume including your cumulative GPA. Transcripts may be requested at a future date.**
- Approximate 6-month Co-op session with competitive pay
- Impactful project work to develop your skills/knowledge
- Leadership speaker sessions and development activities
- One-on-one mentoring
- Involvement in group community service events
- Networking and professional engagement opportunities
- Access to online career development tools and resources
- Opportunity to present project work to company leaders
Duties & Responsibilities
The Data Integration team sits in the middle of ADUSA's Analytics business and IT teams. We bridge the gap by analyzing business requirements, creating Tech intake forms, and facilitating deliverables within the IT team. The team also acts as Product Managers for the Agile squads, which consist of developers and QA team members. As a Co-op, you will sharpen your SQL skills by working with data solutions that span grocery banners, implementing and optimizing views, and creating custom tables, data connections, and functions. You will learn how to evaluate business processes, anticipate requirements, uncover areas for improvement, and develop and implement solutions. You will gain exposure to business thinking by working closely with other analysts, managers, and executives across our banners, and you will work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions, from reporting to advanced data modeling. You will also communicate your insights and plans to cross-functional team members and management, and conduct meetings and presentations to share ideas and findings.
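The view and custom-table work described above can be illustrated with a tiny example; the table, column, and view names below are made up for the sketch, and it uses SQLite only because it runs anywhere:

```python
# Small illustration of creating a reusable SQL view over a sales table.
# Table/column names are hypothetical; shown with SQLite for portability.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (banner TEXT, store_id INTEGER, amount REAL);
    INSERT INTO sales VALUES
        ('BannerA', 1, 120.0),
        ('BannerA', 2, 80.0),
        ('BannerB', 3, 200.0);
    -- A reusable view that aggregates sales by grocery banner.
    CREATE VIEW banner_sales AS
        SELECT banner, SUM(amount) AS total_amount, COUNT(*) AS n_rows
        FROM sales
        GROUP BY banner;
""")
for row in conn.execute("SELECT * FROM banner_sales ORDER BY banner"):
    print(row)
# ('BannerA', 200.0, 2)
# ('BannerB', 200.0, 1)
```

The same pattern (encapsulating an aggregation in a view so downstream reports all query one definition) carries over directly to SQL Server or a cloud warehouse.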
Qualifications
- Must be enrolled in a BA/BS, MS, or PhD program, or be a recent graduate in a related field
- Team player with great interpersonal and communication skills
- Good time-management skills
- Documentation skills
- Sense of ownership and pride in your performance and its impact on the company's success
- Exposure to cloud technologies like Microsoft Azure Data Lake and Databricks concepts like Unity Catalog
- Experience with Python & SQL
- Good understanding of the Atlassian suite, specifically JIRA and Confluence
Skills:
- Python
- SQL
- Atlassian Suite
- JIRA
- Confluence
Individual cohort pay rates vary based on location, academic year, and position.
ME/NC/PA/SC Salary Range: $20.90 - $35.70
IL/MA/MD Salary Range: $22.80 - $37.30
Our client, a fintech leader with amazing culture, is hiring for a contract Data Scientist (Data Science Analyst). This is a hybrid position with 3 days a week onsite in Mountain View, CA.
This role sits on the company's product data science team, working directly with stakeholders across marketing, product, and finance teams to define data requirements, execute data science initiatives (customer segmentation/attribution, campaign analysis, product targeting, experimentation, predictive modeling), and generate actionable insights and recommendations. Strong skills in SQL, Python, Tableau, and ETL pipelining are required.
Contract Duration: 12 Months to Start
Responsibilities:
- Conceptualize business problems or opportunities, formulate hypotheses and goals, define key metrics, and make actionable recommendations
- Drive strategic data science insights supporting the product ecosystem's customer upgrade/attach/monetization/migration initiatives
- Work with marketing stakeholders to define requirements and execute marketing campaign analytics and marketing attribution
- Develop predictive models, conduct experimentation beyond A/B testing, and generate actionable customer insights that inform product innovation
- Build and apply durable customer segmentation patterns to renew product targeting, positioning, and customer experience
- Translate complex data insights into actionable recommendations for technical and non-technical stakeholders, and business leaders
- Raise the craft bar for the analysts on the team
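The customer segmentation work listed above can be sketched with a simple RFM-style rule (recency, frequency, monetary value). The thresholds and segment labels here are purely illustrative assumptions, not the team's actual model:

```python
# Illustrative RFM-style customer segmentation. Thresholds and segment
# labels are hypothetical; a real model would be fit to the data.

def rfm_segment(recency_days, n_orders, total_spend):
    """Assign a coarse segment label from three behavioral signals."""
    if recency_days <= 30 and n_orders >= 5 and total_spend >= 500:
        return "champion"
    if recency_days <= 90 and n_orders >= 2:
        return "active"
    if recency_days > 180:
        return "lapsed"
    return "casual"

customers = {
    "c1": (10, 8, 900.0),   # recent, frequent, high spend
    "c2": (60, 3, 120.0),   # moderately recent, some orders
    "c3": (400, 1, 40.0),   # long inactive
}
segments = {cid: rfm_segment(*vals) for cid, vals in customers.items()}
print(segments)
# {'c1': 'champion', 'c2': 'active', 'c3': 'lapsed'}
```

In practice such segments would be derived in SQL/Python over transaction data and fed into targeting, attribution, and experimentation work like that described above.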
Required Skills:
- Minimum of 5-7 years of experience in business analytics and data science, analyzing business/segment performance and conversion funnels
- Ability to formulate data-backed strategies that will drive step-function growth for the business as well as increase customer benefit
- Experienced in experimentation or A/B testing, marketing campaign analytics, and marketing attribution
- Practical experience constructing data pipelines and ETL utilizing SQL and Python, as well as data solutions from cloud platforms
- Strong data storytelling skills, with a proven ability to rapidly construct impactful visualizations, communicate insights, and influence marketing and product leadership
- Ability to generate hypotheses grounded in customer behavior, industry trends, and external market factors.
- Experience in the SaaS industry is highly valued; Fintech or SMB space experience is a plus.
- Demonstrated experience in building reusable and scalable analytics solutions, with a focus on efficiency and avoiding duplication of work
- Outstanding communication skills with the ability to influence decision makers and build consensus with teams
- Quick learner, adaptable, with the ability to work independently and lead the team in a fast-paced environment
Remote working/work at home options are available for this role.