Data Analyst Skills Examples Jobs in USA
Role: Technical Product Manager (Data/Analytics)
Location: Austin, TX (Onsite)
Experience Required: 10+ years
Mandatory Skills:
1. Marketing data analysis knowledge.
2. KPI and metrics definition on marketing data, mainly for media products.
3. Instrumentation knowledge and thought process.
Original JD:
- Key Qualifications: 7+ years of experience in a Data Visualization, Data Scientist, or Data Analyst role, preferably for a digital subscription business.
- Strong proficiency with SQL-based languages is required, along with experience with large-scale data technologies such as Hadoop and PySpark.
- Proficiency with data visualization tools such as Tableau and/or MicroStrategy for analysis, insight synthesis, data product delivery, and executive presentation.
- You have a curious business mindset with an ability to condense complex concepts and analysis into clear and concise takeaways that drive action.
- Excellent communication, social, and presentation skills with meticulous attention to detail.
- Strong time management skills with the ability to handle multiple projects with tight deadlines and executive visibility.
- Be known for successfully bridging analytics and business teams, with an ability to speak the language of both.
Job Description :
- Build dashboards, self-service tools, and reports to analyze and present data associated with customer experience, product performance, business operations, and strategic decision-making.
- Create datasets and develop global dashboards, data pipelines, sophisticated security controls, and scalable ad-hoc reporting
- Closely partner with our Data Science team to define metrics, datasets, and automation strategy
- Engage with Product, Business, Engineering, and Marketing teams to capture requirements, influence how our services are measured, and craft world-class tools to support those partners.
- Establish a comprehensive roadmap to communicate and manage our commitments and stakeholder expectations while enabling org-wide transparency on progress.
- Focus on scale and efficiency - create and implement innovative solutions and establish best practices across our full scope of delivery
- Education: Minimum of a Bachelor's degree in Computer Science, Statistics, Mathematics, Engineering, Economics, or a related field.
Technical Product Management
Key Qualifications :
- Experience in a Technical Product Management role, preferably for a digital-media or subscription business.
- Knowledge of client-server metrics-logging strategies, as well as the data architecture required for analysis
- Hands-on experience with the end-to-end data lifecycle across petabyte-scale technologies
- Prior experience in a technical role (preferably as a data analyst or engineer), delivering data insights to stakeholders
- Strong experience designing and driving product strategy cross-functionally, collaborating with partners of various technical levels.
Nice to have :
- Experience in data-related programming languages (e.g., SQL, PySpark, Python, or R)
Description :
- Data is our product. We are looking for a self-starting, upbeat individual with excellent communication skills who is passionate about managing and developing critical datasets to maximize Data Science capabilities. You should have a strong interest in driving large-scale data products, engaging with key business stakeholders, and driving critical communications throughout the business.
Job Title: Automotive EDI Business Analyst (Plex ERP)
Location: [Detroit Metro / Hybrid / Remote]
Employment Type: [Full-time / Contract]
Level: Mid-Senior (5+ years automotive EDI)
Position Summary
We are seeking an experienced Automotive EDI Business Analyst with deep expertise in Plex Systems ERP and OEM-specific EDI transactions for Ford Motor Company, Stellantis, and Toyota supply chains.
This role will act as the liaison between business, IT, trading partners, and OEM customers to design, implement, and support EDI integrations within a Plex-based automotive manufacturing environment. The ideal candidate understands automotive release/accounting processes, cumulative tracking, and OEM EDI requirements across the procure-to-ship lifecycle.
Automotive suppliers rely on Plex’s built-in EDI and release management capabilities to automate orders, shipping notifications, and material planning while ensuring OEM compliance.
Key Responsibilities
EDI & OEM Integration
- Lead onboarding and maintenance of OEM and Tier-1 trading partners (Ford, Stellantis, Toyota)
- Analyze, map, and validate automotive EDI transactions, including:
- 830 / DELFOR (Forecast)
- 862 / DELJIT (Sequenced Release)
- 850 / 855 (PO / Acknowledgment)
- 856 / DESADV (ASN)
- 810 (Invoice)
- Ensure compliance with OEM EDI implementation guidelines and MMOG/LE standards
- Coordinate EDI testing, certification, and production cutover with OEMs and VAN providers
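The transaction analysis and mapping work above can be pictured with a minimal sketch. The snippet below parses a toy X12 830 payload and extracts forecast lines; the delimiters, segment layout, and sample data are illustrative assumptions, not any OEM's actual implementation guide.

```python
# Minimal sketch: pulling forecast quantities out of a raw X12 830 payload.
# Segment terminator "~" and element separator "*" are assumed here; real
# trading-partner setups declare these in the ISA envelope and may differ.

def parse_segments(raw: str, seg_term: str = "~", elem_sep: str = "*"):
    """Split a raw X12 payload into lists of elements, one per segment."""
    return [seg.strip().split(elem_sep)
            for seg in raw.strip().split(seg_term) if seg.strip()]

def forecast_lines(segments):
    """Yield (quantity, qualifier, date) tuples from FST segments.

    Positions (FST01 = quantity, FST02 = forecast qualifier such as
    C = firm / D = planning, FST04 = date) follow the toy example below,
    not a full OEM implementation guide.
    """
    for seg in segments:
        if seg[0] == "FST":
            yield int(seg[1]), seg[2], seg[4]

raw_830 = "ST*830*0001~LIN**BP*PART-123~FST*400*C*D*20240506~FST*250*D*D*20240513~SE*5*0001~"
for qty, qualifier, date in forecast_lines(parse_segments(raw_830)):
    print(qty, qualifier, date)
```

In practice this mapping and validation happens inside the EDI translator or Plex itself; the sketch only shows what "analyze, map, and validate" means at the segment level.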
Plex ERP Functional Analysis
- Configure and support Plex EDI source documents and release accounting
- Align Plex cumulative releases, shipping, and ASN processes with OEM requirements
- Support Plex modules impacting EDI flows:
- Customer releases & shipping
- Inventory & MRP
- Logistics / ASN
- Quality & traceability
- Troubleshoot EDI-to-ERP data flow issues and transaction failures
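Aligning cumulative releases comes down to reconciling the OEM's cumulative required quantity against the supplier's cumulative shipped quantity. A minimal sketch with invented numbers and function names (not Plex's actual schema):

```python
# Illustrative cumulative-quantity reconciliation: the OEM's release carries
# a cumulative required quantity; the supplier tracks cumulative shipped.
# The delta shows whether you are ahead of or behind the release.
# All names and figures are hypothetical.

def cum_position(cum_required: int, shipments: list[int]) -> dict:
    cum_shipped = sum(shipments)
    return {
        "cum_required": cum_required,
        "cum_shipped": cum_shipped,
        "balance": cum_required - cum_shipped,  # > 0 means behind schedule
    }

# OEM release says 1,200 cumulative; three ASNs totalling 1,150 have shipped.
position = cum_position(1200, [400, 400, 350])
print(position["balance"])  # 50 units still owed against the release
```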
Business Analysis & Process Improvement
- Gather and document EDI and supply-chain requirements from operations, logistics, and customer service
- Create functional specs, mapping documents, and data flow diagrams
- Identify automation opportunities in order-to-cash and procure-to-pay processes
- Drive continuous improvement in EDI reliability, ASN accuracy, and release processing
Stakeholder & Vendor Management
- Interface with OEM EDI coordinators and customer portals
- Work with EDI providers (OpenText, Cleo, TrueCommerce, etc.)
- Coordinate with Plex integrators and internal IT teams
- Train business users on Plex EDI and release workflows
Required Qualifications
- 5+ years automotive EDI experience in a Tier-1/Tier-2 supplier environment
- Hands-on experience with Plex ERP EDI or release management
- Direct OEM EDI experience with at least two of:
- Ford
- Stellantis
- Toyota
- Strong knowledge of automotive EDI standards:
- ANSI X12
- EDIFACT (DELFOR, DELJIT, DESADV)
- VDA (4905, 4913, 4915)
- Experience with:
- EDI mapping & testing
- ASN and cumulative releases
- Sequencing / JIT / JIS
- Understanding of automotive supply-chain processes (MMOG/LE, IATF)
Preferred Qualifications
- Plex implementation or upgrade project experience
- EDI VAN or platform experience (OpenText, IBM Sterling, Cleo, etc.)
- Experience supporting multiple OEM customers simultaneously
- SQL/data analysis skills for EDI troubleshooting
- APQP / PPAP / automotive quality familiarity
Microsoft
Location: St. Louis, MO (Hybrid)
Pay: $40-50/hr
The candidate in this position will be responsible for assisting with the management of software licensing for Microsoft licensed software products.
Responsibilities include maintaining software license compliance per the contract terms and conditions, ensuring timely software license renewal by following the Agile framework, operating client’s asset management tools, and adhering to client’s Asset Management standards among other activities.
Responsibilities:
- Set up, configure, and maintain application and entitlement data within the Flexera toolset to keep data trustworthy and accurate for each product under management.
- Routinely review and identify data issues using defined measurements and KPIs.
- Remediate identified data inaccuracies or incompleteness to improve the trustworthiness and accuracy of data.
- Work with stakeholders to collect, verify, and maintain the data required to remediate gaps.
- Follow all defined Software Asset Management Center of Excellence standards, including Key Performance Indicators, to maintain data quality.
- Analyze and review Microsoft contracts and data to identify compliance risks and optimization opportunities to manage spend.
- Create and analyze reports for stakeholders as requested.
- Assist in establishing the strategic direction for the Software Asset Management team.
- Embrace a lean-agile mindset.
- Contribute to continuous service improvement efforts to improve processes, tools, and data, promoting an efficient and effective asset management program.
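At its core, the compliance review described above compares owned entitlements against discovered installations. A toy sketch of that effective-license-position calculation, with invented product data (not Flexera's actual data model):

```python
# Toy effective-license-position check: compare owned entitlements against
# discovered installs per product. Field names and counts are illustrative
# only and do not reflect Flexera's actual data model.

entitlements = {"Visio Professional": 120, "Project Professional": 80}
installs = {"Visio Professional": 135, "Project Professional": 60}

def license_position(entitlements: dict, installs: dict) -> dict:
    """Return per-product surplus (positive) or shortfall (negative)."""
    products = set(entitlements) | set(installs)
    return {p: entitlements.get(p, 0) - installs.get(p, 0) for p in products}

for product, delta in sorted(license_position(entitlements, installs).items()):
    status = "compliant" if delta >= 0 else f"shortfall of {-delta}"
    print(f"{product}: {status}")
```

Real Microsoft agreements add license metrics (per-core, per-user, Software Assurance rights) that a one-line subtraction cannot capture; the sketch only shows the shape of the check.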
Qualifications:
- Bachelor's degree preferred, or an equivalent combination of education, training, and/or experience.
- At least 5 years of Asset Management experience preferred; experience in analytical, technology, or business roles will also be considered.
- Experience reviewing and interpreting software license contracts, with an understanding of licensing terms, including Microsoft's.
- Strong data analysis skills, with the ability to critically analyze data and relate it to business value and impact.
- Strong written and verbal communication skills, with the ability to communicate with different stakeholders at different levels of the organization.
- In-depth understanding of enterprise-level Microsoft software agreements.
- In-depth understanding of software license agreements in general: what to look for, what to avoid, etc.
- Ability to learn and maintain updated knowledge of Microsoft license models.
- Strong organizational and problem-solving skills.
- Experience with agile methodologies preferred.
- Microsoft software audit experience a plus.
- Familiarity with toolsets (Flexera, ServiceNow, Ariba, or similar) and MS Office.
About US
C5i is a pure-play AI & Analytics provider that combines the power of human perspective with AI technology to deliver trustworthy intelligence. The company drives value through a comprehensive solution set, integrating multifunctional teams that have technical and business domain expertise with a robust suite of products, solutions, and accelerators tailored for various horizontal and industry-specific use cases. At the core, C5i’s focus is to deliver business impact at speed and scale by driving adoption of AI-assisted decision-making.
C5i caters to some of the world’s largest enterprises, including many Fortune 500 companies. The company’s clients span Technology, Media, and Telecom (TMT), Pharma & Lifesciences, CPG, Retail, Banking, and other sectors. C5i has been recognized by leading industry analysts like Gartner and Forrester for its Analytics and AI capabilities and proprietary AI-based platforms.
Global offices
United States | Canada | United Kingdom | United Arab Emirates | India
Job Description:
Overview: This role involves building data products to extract valuable business insights and requires a highly analytical individual with strong problem-solving skills and a passion for machine learning and research.
Responsibilities
- Undertake data collection, preprocessing, and analysis
- Build models to address business problems
- Present information using data visualization techniques
- Propose solutions and strategies to business challenges
- Collaborate with engineering and product development teams
- Develop machine learning algorithms
- Conduct data-driven experiments to drive business decisions
Required Skills
- Data mining
- Machine learning and operations research
- Proficiency in R, SQL, and Python (knowledge of Scala, Java, or C++ is a plus)
- Experience with business intelligence tools (e.g., Tableau) and data frameworks (e.g., Hadoop)
- Strong math skills (e.g., statistics, algebra)
- Analytical mind and business acumen
- Excellent communication and presentation skills
- Preferred Algorithms and Use Cases:
- Text Analytics: Natural Language Processing (NLP) algorithms for sentiment analysis, text classification, named entity recognition.
- Voice Analytics: Speech recognition, voice-to-text conversion, emotion detection in voice.
- Image Analytics: Image classification, object detection, facial recognition.
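As a miniature illustration of the text-analytics use case, the sketch below scores sentiment with a hand-made lexicon. A real deployment would use trained NLP models as the list above implies; the word lists here are invented.

```python
# Toy lexicon-based sentiment scorer -- a stand-in for the trained NLP
# models a production text-analytics pipeline would actually use.
# The word lists are invented for illustration.

POSITIVE = {"great", "love", "excellent", "fast", "helpful"}
NEGATIVE = {"bad", "slow", "broken", "hate", "poor"}

def sentiment(text: str) -> str:
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("Great product, fast shipping"))     # positive
print(sentiment("Broken on arrival, poor support"))  # negative
```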
Qualifications
- 5+ years of proven experience as a Data Scientist or Data Analyst
- Bachelor's or Master's degree in Computer Science, Engineering, or a relevant field; a graduate degree in Data Science or another quantitative field is preferred
C5i is proud to be an equal opportunity employer. We are committed to equal employment opportunity regardless of race, color, religion, sex, sexual orientation, age, marital status, disability, gender identity, or other protected status. If you have a disability or special need that requires accommodation, please let us know during the hiring process so that we can make the necessary accommodations.
Position Summary
We are seeking a detail-oriented HRIS Analyst with strong experience in benefits enrollment and HR systems administration. This role will support the configuration, maintenance, and optimization of HRIS platforms, with a particular focus on benefits administration, open enrollment processes, and data integrity.
Key Responsibilities
- Administer and maintain HRIS systems, ensuring accurate employee data and system functionality
- Lead and support benefits enrollment processes, including open enrollment and life event changes
- Configure and test system updates related to benefits plans, eligibility rules, and workflows
- Serve as the primary point of contact for HRIS-related benefits issues and troubleshooting
- Collaborate with HR, Payroll, and Benefits teams to ensure seamless data integration
- Generate and analyze reports related to benefits participation, enrollment trends, and compliance
- Ensure compliance with federal, state, and internal benefits regulations and policies
- Support vendor integrations and file feeds for benefits providers
- Assist with system upgrades, implementations, and process improvements
Qualifications
- Bachelor's degree in Human Resources, Information Systems, Business, or related field
- 3+ years of HRIS experience, with a strong focus on benefits enrollment and administration
- Hands-on experience with HRIS platforms (e.g., Workday, UKG, ADP, or similar)
- Knowledge of benefits processes, including open enrollment, eligibility, and compliance
- Strong analytical, problem-solving, and data management skills
- High attention to detail and ability to manage sensitive information confidentially
- Excellent communication and cross-functional collaboration skills
Overview
We are seeking a seasoned Analytics leader to build and lead our enterprise Analytics and Data Governance function in a modern group purchasing / procurement environment. This leader will turn our rich ecosystem of member, supplier, contract, and transaction data into a strategic asset that drives savings, compliance, growth, and differentiated insight for our members and suppliers.
This leader will also own the data governance operating model, enterprise metrics, and analytics roadmap that power member-facing insights, internal performance management, and AI use cases across the technology platform (Website, B2B eCommerce, supplier portal, sourcing tools, and partner integrations).
Key responsibilities
Data governance and policy
- Define and run the enterprise data governance framework covering member, supplier, contract, item, and transaction data domains.
- Establish data ownership and stewardship across functions (Category Management, Supplier Management, Finance, Sales, Marketing, Digital) driving clear accountabilities for data quality and definitions.
- Implement policies for responsible use of data in supplier programs, member reporting, and AI/ML models, ensuring compliance with contractual, regulatory, and privacy requirements.
- Drive data quality management (profiling, remediation, SLAs) for critical assets such as contract price files, item catalogs, rebate/accrual data, and member hierarchies.
- Oversee metadata, business glossary, and data lineage so teams can confidently understand "one source of truth" for core GPO metrics (e.g., committed vs. actual spend, penetration, compliance, savings delivered).
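As a concrete example of one governed GPO metric, the sketch below computes contract penetration (on-contract spend as a share of total spend) per member. The data and field names are invented for illustration.

```python
# Toy contract-penetration metric: on-contract spend / total spend per
# member. Member names, amounts, and fields are invented for illustration.

transactions = [
    {"member": "Hospital A", "spend": 120_000.0, "on_contract": True},
    {"member": "Hospital A", "spend": 30_000.0,  "on_contract": False},
    {"member": "Hospital B", "spend": 90_000.0,  "on_contract": True},
    {"member": "Hospital B", "spend": 90_000.0,  "on_contract": False},
]

def penetration(rows):
    totals, contracted = {}, {}
    for r in rows:
        totals[r["member"]] = totals.get(r["member"], 0.0) + r["spend"]
        if r["on_contract"]:
            contracted[r["member"]] = contracted.get(r["member"], 0.0) + r["spend"]
    return {m: contracted.get(m, 0.0) / t for m, t in totals.items()}

for member, pct in penetration(transactions).items():
    print(f"{member}: {pct:.0%} on-contract")
```

Governance is what makes a metric like this trustworthy: without an agreed definition of "on-contract" and clean member hierarchies, two teams will compute two different penetration numbers from the same data.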
Analytics strategy and delivery
- Define the enterprise analytics vision and roadmap aligned to procurement value levers: spend visibility, category performance, contract compliance, leakage detection, rebate optimization, and supplier performance.
- Lead the design and delivery of standardized KPI suites and dashboards for executives, category teams, supplier partners, and member account teams (e.g., savings scorecards, compliance heatmaps, portfolio optimization).
- Partner with Product and Engineering to ensure the data platform (warehouse, semantic layer, BI tools) can support self-service analytics, embedded insights in member/supplier portals, and AI-driven use cases.
- Champion enterprise metrics and advanced analytics capabilities such as forecasting, benchmarking, opportunity sizing, and integrity analytics, ensuring models are traceable, governed, and auditable.
- Translate business needs into clear data products (curated data sets, subject-area marts, APIs) that serve both internal teams and external-facing solutions.
Stakeholder leadership and collaboration
- Serve as the enterprise "single point of accountability" for data and analytics, aligning priorities across Technology, Category Management, Supplier Relations, Sales, Finance, and Operations.
- Partner with Supplier and Member-facing teams to co-create analytics offerings that differentiate the GPO (e.g., supplier growth playbooks, member CFO dashboards, public-sector transparency packs).
- Educate executives and business leaders on data literacy, standard metrics, and how to use insights in planning, negotiations, and supplier programs.
- Collaborate closely with Security, Legal, and Compliance to ensure that member and supplier data is used ethically and in line with contracts and regulations.
Team building and operations
- Build and lead a high-performing team of data analysts, analytics engineers, data governance managers, and data stewards.
- Define operating rhythms (data council, data domain forums, metric review cadences) that keep governance and analytics tightly connected to business outcomes.
- Establish and track KPIs for the data function itself (data quality scores, adoption of governed datasets, BI usage, time-to-insight).
- Select and manage key tools and vendors in the analytics and governance ecosystem (warehouse, BI, catalog/governance, quality monitoring).
Qualifications
- Bachelor's or Master's degree in Data/Computer Science, Information Systems, Analytics, Statistics, Business, or related field.
- 10+ years of experience in analytics, data governance, or enterprise data management, including 3–5+ years leading teams.
- Proven experience in a procurement, supply chain, GPO, distribution, or B2B marketplace environment strongly preferred.
- Demonstrated success implementing data governance frameworks and delivering analytics that directly influenced commercial or procurement outcomes (e.g., savings, compliance, supplier growth).
- Hands-on familiarity with modern data platforms (e.g., Snowflake/BigQuery/Redshift, dbt, Power BI/Tableau/Looker, and one or more data catalog/governance tools).
- Strong grasp of regulatory / contractual considerations relevant to member and supplier data (data sharing agreements, use of benchmarking, privacy/security standards).
- Excellent leadership, storytelling, and stakeholder management skills; able to influence at C-suite and board levels.
Attributes for success
- Business-first mindset: instinctively ties data work to member value, supplier value, and financial impact.
- Pragmatic operator: balances governance rigor with speed, enabling innovation rather than blocking it.
- Skilled translator: can convert complex data and AI topics into clear narratives for executives, sales, and category leaders.
- Culture builder: passionate about creating a data-driven culture that values standard definitions, trusted data, and measurable outcomes.
Compensation:
$150,000 to $200,000 annual salary.
Exact compensation may vary based on several factors, including skills, experience, and education.
Benefit packages for this role may include healthcare insurance offerings and paid leave as provided by applicable law.
With data being the fuel that drives our future - our strategies, policies, and business successes around data will define our future growth prospects. Unlocking the value available through the innovative use of data on behalf of consumers, businesses, and communities is key to our future. With our ongoing commitment to Visa’s Data Values and the responsible use of data, we at Visa have a bold vision to continue to grow and accelerate our data-
The AI Products & Analytics team under the Global Data Office is creating the next generation of scalable and responsible AI, ML and Data solutions and products to solve client and consumer problems. We are a cross‑functional team of data scientists, product/program managers, data engineers and ML Engineers focused on generating value for the payment ecosystem. We are dreaming of the next generation of AI features and products, Agentic AI solutions and high‑quality analytics and data science support for our internal partner teams.
This position is in the AI Practices & COE sub‑team under the AI Products & Analytics team, focused on AI Transformation of the Global Data Office. The AI Transformation program aims to accelerate operational efficiency and foster innovation through targeted automation. By deploying scalable AI solutions to existing time‑consuming workflows with high potential for AI disruption, this will ensure measurable, sustainable benefits across the Global Data Office.
Responsibilities
- Design and implement agentic AI workflows to automate multi‑step tasks and drive business impact.
- Integrate predictive, generative, and prescriptive AI models into enterprise processes for decision support and efficiency gains.
- Apply ML, deep learning, and NLP techniques to diverse datasets, building scalable, secure data pipelines for AI training, inference, and monitoring.
- Collaborate with product managers, engineers, and domain experts to embed AI solutions into operations.
- Define, track, and report KPIs to measure productivity improvements, cost savings, and accuracy gains.
- Validate AI impact through experimentation frameworks such as A/B testing and performance benchmarking.
- Document workflows, models, and processes to ensure knowledge sharing and adherence to best practices.
- Stay current on emerging AI frameworks and LLM‑based automation, prototyping innovative solutions for rapid adoption.
- Communicate complex technical concepts clearly to technical and non‑technical stakeholders, fostering cross‑functional collaboration.
This is a hybrid position. Expectation of days in the office will be confirmed by your Hiring Manager.
Relocation assistance is not provided for this role.
Basic Qualifications
- 2 or more years of work experience with a Bachelor’s Degree or an Advanced Degree (e.g. Masters, MBA, JD, MD, or PhD).
Preferred Qualifications
- 3 or more years of work experience with a Bachelor’s Degree or more than 2 years of work experience with an Advanced Degree (e.g. Masters, MBA, JD, MD).
- 2+ years of hands‑on work experience with process/workflow automation and experience deploying Agentic AI solutions.
- Advanced Degree with specialization in AI, Computer Science, Data Science, Engineering, Statistics or a highly quantitative field.
- Strong technical proficiency in machine learning and AI frameworks, including TensorFlow, PyTorch, scikit‑learn, and Hugging Face Transformers.
- Experience with agentic AI and orchestration tools such as LangChain, LlamaIndex, or similar frameworks for multi‑step task automation.
- Solid data engineering skills, including SQL, Spark, Databricks, Airflow, Kafka, and ETL/ELT pipeline development.
- Proficiency in Python (primary) and familiarity with Java, Scala, or R.
- Experience with cloud and MLOps practices, including CI/CD, model monitoring, retraining pipelines, and containerization (Docker, Kubernetes).
Work Authorization: Permanent Authorization to work in the U.S. is a precondition of employment for this position. Visa will not sponsor applicants for work visas in connection with this position.
Work Hours: Varies upon the needs of the department.
Travel Requirements: This position requires travel 5‑10% of the time.
Mental/Physical Requirements: This position will be performed in an office setting. The position will require the incumbent to sit and stand at a desk, communicate in person and by telephone, frequently operate standard office equipment, such as telephones and computers.
Visa is an EEO Employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability or protected veteran status. Visa will also consider for employment qualified applicants with criminal histories in a manner consistent with EEOC guidelines and applicable local law.
Visa will consider for employment qualified applicants with criminal histories in a manner consistent with applicable local law, including the requirements of Article 49 of the San Francisco Police Code.
U.S. APPLICANTS ONLY: The estimated salary range for a new hire into this position is 137,400.00 to 193,750.00 USD per year, which may include potential sales incentive payments (if applicable). Salary may vary depending on job‑related factors which may include knowledge, skills, experience, and location. In addition, this position may be eligible for bonus and equity. Visa has a comprehensive benefits package for which this position may be eligible that includes Medical, Dental, Vision, 401(k), FSA/HSA, Life Insurance, Paid Time Off, and Wellness Program.
Job Description Summary
We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. In this role, you will be instrumental in designing, building, and maintaining robust and scalable data pipelines and solutions within the Microsoft Azure ecosystem. You will be responsible for developing and optimizing ETL/ELT processes, ensuring data quality, and enabling efficient data access for analytics and business intelligence. We are looking for a hands-on engineer who thrives in a fast-paced environment and is passionate about leveraging cutting-edge technologies.
Key Responsibilities:
Design, develop, and maintain cloud-based data pipelines and ETL/ELT workflows.
Build and optimize data architectures to support structured and unstructured data processing.
Collaborate with data analysts, data scientists, and business stakeholders to understand data needs.
Implement data quality, security, and governance best practices.
Monitor and troubleshoot data workflows to ensure high availability and performance.
Optimize database and data storage solutions for performance and cost efficiency.
Contribute to cloud adoption, migration, and modernization initiatives.
Mandatory Skills:
Strong expertise with Azure cloud platform.
Strong experience in Databricks
Azure Data Factory proficiency required; building datasets, data flows, and pipelines in ADF (not just maintaining something already built)
Hands-on experience with ETL/ELT tools and frameworks.
Proficiency in SQL, Python, and data modeling.
Knowledge of CI/CD pipelines and infrastructure-as-code tools.
Understanding of data governance, security, and compliance.
Preferred Skills:
Exposure to API integration and microservices architecture.
Strong analytical and problem-solving skills.
Azure cloud certifications and/or relevant hands-on Azure experience.
Experience with AKS (Azure Kubernetes Service), including ETL for applications containerized and deployed on AKS (or EKS).
About Wakefern
Wakefern Food Corp. is the largest retailer-owned cooperative in the United States and supports its co-operative members' retail operations, trading under the ShopRite®, Price Rite®, The Fresh Grocer®, Dearborn Markets®, and Gourmet Garage® banners.
Employing an innovative approach to wholesale business services, Wakefern focuses on helping the independent retailer compete in a big business world. Providing the tools entrepreneurs need to stay a step ahead of the competition, Wakefern’s co-operative members benefit from the company’s extensive portfolio of services, including innovative technology, private label development, and best-in-class procurement practices.
The ideal candidate will have a strong background in designing, developing, and implementing complex projects, with focus on automating data processes and driving efficiency within the organization. This role requires a close collaboration with application developers, data engineers, data analysts, data scientists to ensure seamless data integration and automation across various platforms. The Data Integration & AI Engineer is responsible for identifying opportunities to automate repetitive data processes, reduce manual intervention, and improve overall data accessibility.
Essential Functions
- Participate in the development life cycle (requirements definition, project approval, design, development, and implementation) and maintenance of the systems.
- Implement and enforce data quality and governance standards to ensure data accuracy and consistency.
- Provide input for project plans and timelines to align with business objectives.
- Monitor project progress, identify risks, and implement mitigation strategies.
- Work with cross-functional teams and ensure effective communication and collaboration.
- Provide regular updates to the management team.
- Follow the standards and procedures according to Architecture Review Board best practices, revising standards and procedures as requirements change and technological advancements are incorporated into the technology structure.
- Communicate and promote the code of ethics and business conduct.
- Ensure completion of required company compliance training programs.
- Be trained, either through formal education or through experience, in software/hardware technologies and development methodologies.
- Stay current through personal development and professional and industry organizations.
Responsibilities
- Design, build, and maintain automated data pipelines and ETL processes to ensure scalability, efficiency, and reliability across data operations.
- Develop and implement robust data integration solutions to streamline data flow between diverse systems and databases.
- Continuously optimize data workflows and automation processes to enhance performance, scalability, and maintainability.
- Design and develop end-to-end data solutions utilizing modern technologies, including scripting languages, databases, APIs, and cloud platforms.
- Ensure data solutions and data sources meet quality, security, and compliance standards.
- Monitor and troubleshoot automation workflows, proactively identifying and resolving issues to minimize downtime.
- Provide technical training, documentation, and ongoing support to end users of data automation systems.
- Prepare and maintain comprehensive technical documentation, including solution designs, specifications, and operational procedures.
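The pipeline pattern described above can be reduced to a standard-library sketch. Real pipelines in this role would run on orchestrated cloud services; the sample data below is invented.

```python
# Minimal extract-transform-load sketch using only the standard library.
# Real pipelines here would run on orchestrated cloud services (for example
# Dataflow plus Cloud Composer); this only shows the E-T-L pattern.
import csv
import io
import sqlite3

RAW_CSV = """sku,qty,unit_price
A-100,4,2.50
A-101,2,7.25
A-100,1,2.50
"""

def extract(raw: str):
    """E: read raw rows from a CSV source."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """T: normalize types and derive a line total per row."""
    return [(r["sku"], int(r["qty"]), float(r["unit_price"]),
             int(r["qty"]) * float(r["unit_price"])) for r in rows]

def load(records, conn):
    """L: persist the transformed records to a database table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales "
                 "(sku TEXT, qty INT, unit_price REAL, total REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
print(conn.execute(
    "SELECT sku, SUM(total) FROM sales GROUP BY sku ORDER BY sku").fetchall())
```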
Qualifications
- A bachelor's degree or higher in computer science, information systems, or a related field.
- Hands-on experience with cloud data platforms (e.g., GCP, Azure, etc.)
- Strong knowledge and skills in data automation technologies, such as Python, SQL, ETL/ELT tools, Kafka, APIs, cloud data pipelines, etc.
- Experience in GCP BigQuery, Dataflow, Pub/Sub, and Cloud storage.
- Experience with workflow orchestration tools such as Cloud Composer or Airflow
- Proficiency in iPaaS (Integration Platform as a Service) platforms, such as Boomi, SAP BTP, etc.
- Develop and manage data integrations for AI agents, connecting them to internal and external APIs, databases, and knowledge sources to expand their capabilities.
- Build and maintain scalable Retrieval-Augmented Generation (RAG) pipelines, including the curation and indexing of knowledge bases in vector databases (e.g., Pinecone, Vertex AI Vector Search).
- Leverage cloud-based AI/ML platforms (e.g., Vertex AI, Azure ML) to build, train, and deploy machine learning models on a scale.
- Establish and enforce data quality and governance standards for AI/ML datasets, ensuring the accuracy, completeness, and integrity of data used for model training and validation.
- Collaborate closely with data scientists and machine learning engineers to understand data requirements and deliver optimized data solutions that support the entire machine learning lifecycle.
- Hands-on experience with IBM DataStage and Alteryx is a plus.
- Strong understanding of database design principles, including normalization, indexing, partitioning, and query optimization.
- Ability to design and maintain efficient, scalable, and well-structured database schemas to support both analytical and transactional workloads.
- Familiarity with BI visualization tools such as MicroStrategy, Power BI, Looker, or similar.
- Familiarity with data modeling tools.
- Familiarity with DevOps practices for data (CI/CD pipelines)
- Proficiency in project management software (e.g., JIRA, Clarizen, etc.)
- Strong knowledge and skills in data management, data quality, and data governance.
- Strong communication, collaboration, and problem-solving skills.
- Ability to work on multiple projects and prioritize tasks effectively.
- Ability to work independently and in a team environment.
- Ability to learn new technologies and tools quickly.
- The ability to handle stressful situations.
- Highly developed business acumen.
- Strong critical thinking and decision-making skills.
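The retrieval step of the RAG pipelines mentioned above can be shown in miniature with cosine similarity over toy vectors. A real build would use learned embeddings and a vector database rather than hand-made 3-dimensional vectors.

```python
# Toy retrieval step of a RAG pipeline: rank documents by cosine similarity
# to a query vector. The 3-d vectors are hand-made stand-ins for learned
# embeddings; a real pipeline would use an embedding model and a vector
# database rather than a Python dict.
import math

docs = {
    "returns policy":   [0.9, 0.1, 0.0],
    "store hours":      [0.1, 0.8, 0.2],
    "refund timelines": [0.8, 0.2, 0.1],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(y * y for y in b)))

def retrieve(query_vec, k=2):
    """Return the top-k document names most similar to the query vector."""
    ranked = sorted(docs, key=lambda d: cosine(docs[d], query_vec), reverse=True)
    return ranked[:k]

print(retrieve([1.0, 0.1, 0.0]))  # refund-related docs rank first
```

The retrieved passages would then be stuffed into the model's prompt as grounding context, which is the "augmented generation" half of RAG.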
Working Conditions & Physical Demands
This position requires in-person office presence at least 4x a week.
Compensation and Benefits
The salary range for this position is $75,868 - $150,644. Placement in the range depends on several factors, including experience, skills, education, geography, and budget considerations.
Wakefern is proud to offer a comprehensive benefits package designed to support the health, well-being, and professional development of our Associates. Benefits include medical, dental, and vision coverage, life and disability insurance, a 401(k) retirement plan with company match & annual company contribution, paid time off, holidays, and parental leave.
Associates also enjoy access to wellness and family support programs, fitness reimbursement, educational and training opportunities through our corporate university, and a collaborative, team-oriented work environment. Many of these benefits are fully or partially funded by the company, with some subject to eligibility requirements.
Senior P&C Insurance Data Scientist opening in Atlanta.
Lead team of data analysts and data engineers in designing/maintaining Machine Learning and AI solutions; collaborate with IT, Legal, and Claims teams to identify new analytical projects to increase profitability; gather data from internal stakeholders using Python, SQL, Tableau, and Power BI.
Ideal candidate has 8+ years of Data Science/Predictive Modeling experience in a P&C insurance setting, including extensive background in Python, SQL, and Git.
(PR13082)