Data Migration vs. Database Migration Jobs in USA
9,489 positions found
The ideal candidate will be responsible for accurately entering, updating, and maintaining data in our computer systems and databases.
This role requires strong attention to detail, basic computer skills, and the ability to work efficiently.
Key Responsibilities:
- Enter and update data in company databases and systems
- Verify the accuracy of data before entering it
- Maintain organized records and files
- Review data for errors and correct any inconsistencies
- Generate reports when required
- Maintain confidentiality of sensitive information
Requirements:
- High school diploma or equivalent
- Basic computer skills (MS Word, Excel, and data management systems)
- Good typing speed and accuracy
- Strong attention to detail
- Ability to work independently and meet deadlines
Your role and responsibilities
About the Opportunity
IBM Consulting is seeking an accomplished Data & Analytics Associate Partner to accelerate our growth within the Industrial & Communications sectors. This executive role is responsible for shaping client vision, cultivating senior executive relationships, and developing data-driven solutions that enable clients to successfully navigate complex transformation programs.
You will bring together deep industry expertise and IBM’s portfolio of data, analytics, and AI capabilities to help organizations modernize their data ecosystems—migrating from legacy platforms to modern hybrid cloud architectures—while adopting next-generation analytics, GenAI, and agentic AI to strengthen decision-making and deliver measurable business and financial outcomes.
This role is ideal for a seasoned leader who integrates industry depth, consulting excellence, and technical thought leadership, has a strong understanding of competitive market dynamics, and consistently delivers high-impact transformation at scale.
Key Responsibilities
Market Leadership & Growth
Expand IBM’s Data & Analytics presence by identifying new market opportunities, developing differentiated solutions, and building a strong pipeline.
Engage senior client executives to understand strategic priorities and shape data transformation roadmaps aligned to their business and financial goals.
Lead end-to-end sales cycles, including solution definition, proposal leadership, financial structuring, and contract negotiation.
Strategic Advisory & Transformation Delivery
Advise C-suite leaders on strategies for data estate modernization, advanced analytics, GenAI, and agentic AI to drive business performance.
Architect integrated solutions that include:
Migration from legacy data platforms to modern cloud-based architectures
Data engineering and information governance
Business intelligence and advanced analytics
GenAI-powered and agentic AI-driven automation and decisioning
Lead complex transformation programs from discovery through delivery, ensuring measurable outcomes and client satisfaction.
Engagement Excellence & Financial Stewardship
Oversee multi-disciplinary delivery teams to ensure high-quality, consistent execution across all program phases.
Manage engagement financials, including forecasting, margin performance, and overall portfolio profitability.
Align the right client technologies, industry expertise, and global delivery capabilities to maximize client value.
Practice Building & Talent Development
Recruit, mentor, and grow top-tier consultants, architects, and data specialists.
Build and scale capabilities in data modernization, cloud data engineering, analytics, GenAI, and emerging agentic AI techniques.
Contribute to practice strategy, offering development, and capability growth across the global Data & Analytics team.
Thought Leadership & Market Presence
Stay ahead of sector and technology trends, including cloud modernization, GenAI, agentic system design, regulatory changes, and evolving competitive dynamics.
Represent IBM at industry conferences, client events, webinars, and executive roundtables.
Create original thought leadership—articles, perspectives, points of view—that positions IBM as a leading advisor in data and AI-driven transformation.
This position can be performed anywhere in the US.
"Leaders are expected to spend time with their teams and clients and therefore are generally expected to be in the workplace a minimum of three days a week, subject to business needs."
Required technical and professional expertise
Qualifications
12+ years of experience in consulting, data strategy, analytics, or digital transformation, with strong exposure to the Industrial or Communications sectors.
Hands-on experience modernizing data ecosystems, including migrating from legacy on-premise platforms to modern cloud-native or hybrid cloud architectures.
Deep expertise with major cloud platforms and their data/analytics stacks, including implementation experience with:
AWS (e.g., Redshift, S3, Glue, EMR, Athena, Lake Formation, Bedrock, SageMaker)
Microsoft Azure (e.g., Azure Data Lake, Synapse, Data Factory, Databricks on Azure, Fabric, Cognitive Services)
Google Cloud Platform (e.g., BigQuery, Cloud Storage, Dataflow, Dataproc, Vertex AI)
Experience designing and implementing end-to-end data pipelines, governance frameworks, and analytics solutions on one or more of these platforms.
Strong understanding of GenAI architectures, LLM integration patterns, vector databases, retrieval-augmented generation (RAG), and emerging agentic AI frameworks.
Proven track record of selling, structuring, and delivering large-scale data and AI transformation programs.
Robust technical and functional expertise in data engineering, cloud data platforms, analytics, AI/ML, information management, and governance.
Executive-level communication and presence, with demonstrated ability to influence senior stakeholders and convey complex topics through compelling narratives.
Financial management experience, including engagement economics, forecasting, margin optimization, and portfolio profitability.
Demonstrated leadership in building, scaling, and developing high-performing consulting and technical teams.
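The qualifications above reference retrieval-augmented generation (RAG). As a rough illustration of the retrieval step only, the following minimal sketch ranks documents by cosine similarity to a query embedding; the corpus, document names, and 3-dimensional vectors are invented toy values standing in for a real embedding model and vector database:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec, corpus, top_k=2):
    """Rank (doc_id, embedding) pairs by similarity to the query vector.

    In a production RAG system the embeddings come from an embedding
    model and are stored in a vector database; these are toy values.
    """
    scored = [(doc_id, cosine_similarity(query_vec, emb))
              for doc_id, emb in corpus]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return [doc_id for doc_id, _ in scored[:top_k]]

# Toy corpus: three "documents" with hand-made 3-d embeddings.
corpus = [
    ("pricing_policy", [0.9, 0.1, 0.0]),
    ("migration_runbook", [0.1, 0.9, 0.2]),
    ("holiday_schedule", [0.0, 0.2, 0.9]),
]
print(retrieve([0.2, 0.8, 0.1], corpus, top_k=1))  # → ['migration_runbook']
```

The retrieved documents would then be passed to an LLM as grounding context, which is the "augmented generation" half of the pattern.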
Preferred technical and professional experience
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.
Data Analytics Internship
Los Angeles, CA, USA (Hybrid role)
Part-Time, $17.87/hr, Mid-April 2026 to Mid-August 2026.
DailyLook, a subsidiary of Victoria’s Secret & Co. (NYSE: VSCO) since being acquired in December 2022, is seeking a Data Analytics Intern. This internship offers the opportunity to work across two key teams at DailyLook: Demand Planning and Data Growth. The intern will be at the core of the business, leveraging data and analytics to support strategic initiatives and help drive data-informed improvements across operations, inventory planning, and growth initiatives. This is a great chance to gain hands-on experience working with real business data while contributing to impactful decisions!
Qualifications for the Position
- A degree in (or a junior, senior, or graduate student pursuing a degree in) data science, statistics, computer science, economics (quantitative track), applied analytics, mathematics, or business analytics.
- GPA 3.3+ preferred
- Coursework or experience in statistical analysis, data analytics, or machine learning.
- Experience with database systems, SQL and Python
- Familiarity with BI tools such as Looker or Tableau.
- Exemplary interpersonal communication skills both verbal and written
- Highly motivated, collaborative
- Experience in a Startup or Retail industry is an extra plus!
- An intellectually curious team player with a no-compromises approach to work quality, attention to detail, organization, and the ability to manage multiple priorities and projects in a fast-paced environment
- Self-motivated, detail-oriented, hands-on go-getter with the ability to build and suggest overhaul processes where needed, take initiative, work independently and proactively, multi-task, and remain flexible with changing priorities
- “I’ll find a way!” mindset where you can leverage your autonomy within your role to think outside the box.
- Demonstrated ability to communicate and collaborate effectively across global teams by adapting to diverse cultural norms, respecting time zone differences, and leveraging digital collaboration tools to maintain alignment and productivity
- Skilled in building trust and fostering inclusive communication styles that support clarity, empathy, and shared goals in international work environments
- Ability and willingness to work on-site at our office in Downtown LA at least once a week.
Responsibilities
- Reports to the Planning Team.
- Maintain and migrate existing demand planning and inventory reports to the current BI tool.
- Build and update weekly and monthly dashboards covering product performance, box performance, and styling metrics
- Assist in developing demand planning assumptions and forecasting frameworks (style demand, size curves, inventory flow)
- Build basic planning tools in Google Sheets / BI tools to support: Size curve projections & Product lifecycle tracking
- Conduct assortment and scenario analysis to support predictive demand planning
- Analyze inventory health, sell-through trends, and replenishment opportunities
- Identify optimization opportunities within the current planning workflow and BI infrastructure
- Document demand planning processes and support improvements to internal planning tools.
- Support the team in analyzing marketing and subscription performance, including acquisition, traffic/funnel, CRM, engagement, etc.
- Support migration and setup of analytics tools and platforms to improve tracking of user behavior and marketing performance
- Assist with dashboard updates, reporting, and basic data checks to ensure data quality
- Help monitor A/B tests and experiments for CRM campaigns and website initiatives
- Conduct ad-hoc analyses to provide insights and recommendations for the team
- Document data workflows & the new data infrastructure.
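To give a sense of the planning math behind items like size curve projections and sell-through analysis, here is a small Python sketch. The figures are invented, and the "units received" denominator is just one common definition of sell-through; the actual metrics would follow DailyLook's internal definitions:

```python
def sell_through_rate(units_sold, units_received):
    """Share of received units sold in the period (one common definition)."""
    return units_sold / units_received

def size_curve(units_by_size):
    """Normalize unit sales per size into a share-of-total 'size curve'."""
    total = sum(units_by_size.values())
    return {size: units / total for size, units in units_by_size.items()}

# Toy data: last season's unit sales by size.
sales = {"S": 30, "M": 50, "L": 20}
curve = size_curve(sales)  # {'S': 0.3, 'M': 0.5, 'L': 0.2}

# Project a hypothetical 400-unit buy across sizes using the curve.
projected_buy = {size: round(share * 400) for size, share in curve.items()}
print(projected_buy)                # → {'S': 120, 'M': 200, 'L': 80}
print(sell_through_rate(100, 125))  # → 0.8
```

The same normalize-and-project logic carries over directly to Google Sheets or a BI tool, which is where the posting says these planning tools would actually live.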
Compensation & Benefits
The pay for this position is $17.87 an hour. This is a non-exempt, part-time position.
DailyLook is proud to provide equal opportunity to all employees and qualified applicants without regard to race, color, religion, national origin or citizenship, age, sex, marital status, ancestry, legally protected physical or mental disability, veteran status, gender identity, sexual orientation or any other basis protected under applicable law.
By applying for this position, the applicant authorizes DailyLook to check all references listed on the application and/or resume.
The University of Maryland (UMD) seeks a Manager of Data Analytics Enablement to lead the adoption and modernization of enterprise analytics capabilities that enable trusted, data-informed decision-making across campus.
This is an exciting time to join UMD as we advance enterprise data and analytics through a period of innovative growth and modernization.
This role will play a key part in shaping the future of enterprise business intelligence, advancing Microsoft Power BI and Fabric capabilities, and embedding sustainable data quality and stewardship practices into analytics workflows.
Reporting to the Director of Enterprise Data Services, this position partners with institutional leaders, IT teams, and enterprise stakeholders to deliver reliable data products, consistent metrics, and actionable insights.
The manager will lead a team of data professionals and advance practical, operational governance practices that support trusted analytics and long-term institutional impact.
Key Responsibilities: Lead the strategy, development, and continuous improvement of the university’s enterprise business intelligence environment, including Microsoft Power BI and Microsoft Fabric.
Establish standards, best practices, and architectural patterns for semantic models, dashboards, and analytics delivery.
Guide migration and modernization efforts to ensure scalable, secure, and high-performing analytics solutions.
Develop and manage an analytics intake, prioritization, and delivery framework aligned with institutional priorities.
Define and implement data quality monitoring practices to ensure reliability, accuracy, and consistency of enterprise data assets.
Partner with technical teams to embed validation, monitoring, and observability into data pipelines and lakehouse environments.
Promote consistent metric definitions and collaborate with campus stakeholders to clarify data ownership and stewardship roles.
Support adoption of metadata management, data catalog, and lineage capabilities.
Ensure analytics solutions align with university standards for security, privacy, and responsible data use.
Manage, mentor, and develop a team of analytics and data professionals, fostering a culture of quality, collaboration, and service.
Communicate analytics priorities, progress, and impact to leadership and campus partners.
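As one concrete, purely illustrative example of the data quality monitoring practices described above, the sketch below applies two simple rules, a null-rate threshold and a freshness window, to toy records. The column names, thresholds, and records are assumptions; real checks at enterprise scale would be embedded in pipeline tooling rather than ad-hoc scripts:

```python
from datetime import datetime, timedelta

def null_rate(rows, column):
    """Fraction of rows where the column is missing (None)."""
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

def check_quality(rows, column, max_null_rate, max_age, now):
    """Return a list of rule violations for one column of a dataset."""
    violations = []
    rate = null_rate(rows, column)
    if rate > max_null_rate:
        violations.append(
            f"{column}: null rate {rate:.0%} exceeds {max_null_rate:.0%}")
    newest = max(r["loaded_at"] for r in rows)  # freshness check
    if now - newest > max_age:
        violations.append(f"{column}: data older than {max_age}")
    return violations

now = datetime(2026, 3, 1, 12, 0)
rows = [
    {"student_id": "A1", "loaded_at": now - timedelta(hours=2)},
    {"student_id": None, "loaded_at": now - timedelta(hours=3)},
]
print(check_quality(rows, "student_id", max_null_rate=0.25,
                    max_age=timedelta(days=1), now=now))
# → ['student_id: null rate 50% exceeds 25%']
```

Checks like these are what "embedding validation, monitoring, and observability into data pipelines" looks like at the smallest possible scale.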
This position is considered essential and may be required to work at the normal work location or an alternative location during a major catastrophic event, weather emergency, or other operational emergency to help maintain the continuity of University services.
May be required to work evenings, nights, weekends, or different shifts for extended periods.
KNOWLEDGE, SKILLS, & ABILITIES:
Knowledge of data privacy and security principles and practices necessary to protect systems and data from threats.
Knowledge in areas of subject matter expertise such as databases, data modeling, ETL, reporting, data governance practices, metadata management, data stewardship, and/or regulatory compliance.
Skill in SQL or programming/scripting languages (e.g., Python) used for integrations, data pipelines, report development, and data management.
Skill in adapting communication style to different audiences, including technical, business, and executive stakeholders.
Skill in the use of office productivity software such as Office 365 or Google Workspace.
Ability to lead presentations and training for large groups.
Ability to manage communications and relationships with technical and business stakeholders.
Ability to collaborate effectively with other Managers, Assistant Directors, and Directors to identify and solve problems, make improvements, and address ongoing issues.
Ability to provide a team with effective direction and support in implementations using standards and techniques that lead to a repeatable and reliable solution.
Ability to ensure documentation standards and procedures are implemented for all team responsibilities.
Ability to define deadlines and manage the quality of the work delivered.
Ability to comprehend and handle interpersonal dynamics, demonstrate empathy towards team members, and effectively manage conflicts or challenging circumstances.
Ability to coach and mentor team members in order to enhance their performance, provide constructive feedback, and support skill development.
Physical Demands: Sedentary work.
Exerting up to 10 pounds of force occasionally and/or negligible amount of force frequently or constantly to lift, carry, push, pull or otherwise move objects.
Repetitive motion.
Substantial movements (motions) of the wrists, hands, and/or fingers.
The worker is required to have close visual acuity to perform an activity such as: preparing and analyzing data and figures; transcribing; viewing a computer terminal; extensive reading.
Minimum Qualifications
Education: Bachelor’s degree from an accredited college or university.
Experience: Three (3) years of professional experience supporting the operations, maintenance, and administration of data systems, analytics platforms, or data management programs.
One (1) year leading or supervising professional staff.
Other: Additional work experience as defined above may be substituted on a year for year basis for up to four (4) years of the required education.
Preferences: Demonstrated experience leading business intelligence or enterprise analytics initiatives.
Experience managing or mentoring data professionals in a collaborative team environment.
Strong experience with Power BI and modern data platforms such as Microsoft Fabric, Databricks, or similar cloud-based analytics ecosystems.
Proficiency with SQL and/or Python in support of analytics, data modeling, or data quality initiatives.
Experience implementing or advancing data quality practices, including validation, monitoring, or metric standardization.
Experience supporting practical data governance activities such as establishing shared definitions, coordinating data stewardship, or implementing metadata/catalog tools.
Demonstrated ability to collaborate across diverse stakeholders and translate business needs into scalable analytics solutions.
Strong communication skills with the ability to engage both technical and non-technical audiences.
Experience using Jira or similar tools for work intake, project tracking, and prioritization.
Additional Information: Please note that all positions within the Division of Information Technology (DIT) have an in-person component, with expected time each week at our College Park, MD location.
Telework is not a guaranteed work arrangement.
Visa Sponsorship Information: DIT will not sponsor the successful candidate for work authorization in the United States now or in the future.
F1 STEM OPT support is not available for this position.
Required Application Materials: Resume, Cover Letter, List of Three References
Best Consideration Date: March 26, 2026
Open Until Filled: Yes
Salary Range: $149,120.00 - $178,944.00
Please apply at:
Job Risks: Not Applicable to This Position
Financial Disclosure Required: No
For more information on Financial Disclosure, please visit Maryland's State Ethics Commission website.
Department: DIT-EE-Enterprise Data Services
Worker Sub-Type: Staff Regular
Benefits Summary: For more information on Regular Exempt benefits, select this link.
Background Checks: Offers of employment are contingent on completion of a background check.
Information reported by the background check will not automatically disqualify anyone from employment.
Before any adverse decision, the finalist will have an opportunity to provide information to the University regarding disclosable background check information.
The University reserves the right to rescind the offer of employment or otherwise decline or terminate employment if the information reported by the background check is deemed incompatible with the position, regardless of when the background check is completed.
Employment Eligibility: The successful candidate must complete employment eligibility verification (on Form I-9) by presenting documents that establish identity and work authorization within the timeframe required by federal immigration law, and where applicable, to demonstrate renewed employment authorization.
Failure to complete employment eligibility verification or reverification within the timeframe set forth by law may result in suspension or termination of employment.
EEO Statement : The University of Maryland, College Park is an Equal Opportunity Employer.
All qualified applicants will receive equal consideration for employment.
Please read the University’s Equal Employment Opportunity Statement of Policy.
Title IX Non-Discrimination Notice
See above description for requirements.
Job Description Summary
We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. In this role, you will be instrumental in designing, building, and maintaining robust and scalable data pipelines and solutions within the Microsoft Azure ecosystem. You will be responsible for developing and optimizing ETL/ELT processes, ensuring data quality, and enabling efficient data access for analytics and business intelligence. We are looking for a hands-on engineer who thrives in a fast-paced environment and is passionate about leveraging cutting-edge technologies.
Key Responsibilities:
Design, develop, and maintain cloud-based data pipelines and ETL/ELT workflows.
Build and optimize data architectures to support structured and unstructured data processing.
Collaborate with data analysts, data scientists, and business stakeholders to understand data needs.
Implement data quality, security, and governance best practices.
Monitor and troubleshoot data workflows to ensure high availability and performance.
Optimize database and data storage solutions for performance and cost efficiency.
Contribute to cloud adoption, migration, and modernization initiatives.
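The ETL/ELT responsibilities above reduce to a skeletal extract-transform-load flow. The sketch below is a toy, pure-Python illustration of that flow with a basic quality gate and an idempotent load; in this role the real implementation would live in Azure Data Factory or Databricks, and the field names and quality rules here are assumptions:

```python
def extract(source):
    """Stand-in for reading raw records from a source system."""
    return list(source)

def transform(records):
    """Normalize types and apply assumed quality rules on 'amount'."""
    cleaned = []
    for rec in records:
        if rec.get("amount") is None:
            continue  # quality rule: amount must be present
        amount = float(rec["amount"])
        if amount < 0:
            continue  # quality rule: amount must be non-negative
        cleaned.append({"id": rec["id"], "amount": round(amount, 2)})
    return cleaned

def load(records, target):
    """Idempotent load keyed by id: reruns overwrite rather than duplicate."""
    for rec in records:
        target[rec["id"]] = rec
    return target

raw = [{"id": 1, "amount": "19.994"},
       {"id": 2, "amount": -5},
       {"id": 3, "amount": None}]
warehouse = {}
load(transform(extract(raw)), warehouse)
print(warehouse)  # → {1: {'id': 1, 'amount': 19.99}}
```

Keying the load on an id is one simple way to make reruns safe, which matters when monitoring restarts a failed workflow partway through.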
Mandatory Skills:
Strong expertise with Azure cloud platform.
Strong experience with Databricks.
Azure Data Factory proficiency required, including building datasets, data flows, and pipelines in ADF (not just maintaining existing ones).
Hands-on experience with ETL/ELT tools and frameworks.
Proficiency in SQL, Python, and data modeling.
Knowledge of CI/CD pipelines and infrastructure-as-code tools.
Understanding of data governance, security, and compliance.
Preferred Skills:
Exposure to API integration and microservices architecture.
Strong analytical and problem-solving skills.
Azure cloud certifications and/or relevant prior Azure experience.
AKS (Azure Kubernetes Service) experience, and ETL related to applications containerized & deployed on AKS (or EKS)
Purpose
The IT Database Engineer is responsible for designing, implementing, and supporting relational database platforms in both traditional data centers and Azure cloud environments. The role covers installation, configuration, performance tuning, high availability, backup and recovery, monitoring, and incident response for Microsoft SQL Server, MySQL, and PostgreSQL, with participation in an on-call rotation to support mission-critical workloads.
Key Responsibilities
- Install, configure, and upgrade MSSQL, MySQL, and PostgreSQL in data center and Azure environments (IaaS and/or PaaS as applicable).
- Perform day-to-day database administration, including user and role management, permissions, schema changes, and maintenance tasks.
- Monitor database health, performance, and capacity using native and third-party tools; define meaningful alerts and dashboards for proactive issue detection.
- Troubleshoot database incidents (performance issues, blocking/deadlocks, failed jobs, connectivity problems, resource constraints) and drive root-cause analysis and permanent fixes.
- Design, implement, and maintain backup and recovery strategies (full/diff/log, PITR, snapshots, Azure backup options) and regularly test restore procedures.
- Implement and support high availability and disaster recovery configurations (e.g., SQL Server Always On, failover clustering, log shipping, MySQL/Postgres replication, Azure availability sets/zones).
- Optimize database performance through indexing strategies, query tuning, statistics management, and configuration tuning at both OS and database levels.
- Implement and enforce security controls (authentication, authorization, encryption at rest/in transit, auditing) aligned with organizational and regulatory requirements.
- Support application and development teams with database design, query optimization, and controlled deployment of schema changes across environments.
- Maintain detailed documentation including runbooks, standards, topology diagrams, data flows, and operational procedures for both on-prem and Azure deployments.
- Participate in an on-call rotation, responding to after-hours incidents, and perform planned maintenance during maintenance windows.
- Automate routine tasks (provisioning, checks, patching, reporting) using scripts and tooling (e.g., T-SQL, PowerShell, Bash, Python, Azure CLI).
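The backup-and-recovery duties above hinge on backup-chain reasoning for point-in-time recovery: latest full backup, then the latest differential after it, then the log backups up to the target. The following Python sketch captures that selection logic only; it is a simplification (real engines such as SQL Server also need the log backup that spans the target time itself), and the integer timestamps are toy values:

```python
def restore_sequence(backups, target_time):
    """Choose backups for a point-in-time restore (simplified sketch).

    backups: iterable of (kind, finished_at) tuples where kind is
    'full', 'diff', or 'log' and finished_at values are comparable.
    """
    fulls = [t for kind, t in backups if kind == "full" and t <= target_time]
    if not fulls:
        raise ValueError("no full backup covers the target time")
    full_t = max(fulls)  # latest full at or before the target
    diffs = [t for kind, t in backups
             if kind == "diff" and full_t < t <= target_time]
    base_t = max(diffs) if diffs else full_t  # diff supersedes earlier logs
    log_ts = sorted(t for kind, t in backups
                    if kind == "log" and base_t < t <= target_time)
    chain = [("full", full_t)] + ([("diff", base_t)] if diffs else [])
    return chain + [("log", t) for t in log_ts]

backups = [("full", 0), ("log", 2), ("log", 4),
           ("diff", 6), ("log", 8), ("full", 10)]
print(restore_sequence(backups, 9))
# → [('full', 0), ('diff', 6), ('log', 8)]
```

Regularly testing that this chain actually restores, as the responsibilities above require, is what turns a backup strategy into a recovery strategy.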
Required Qualifications
- Proven experience as a Database Engineer/DBA supporting MSSQL, MySQL, and PostgreSQL in production environments.
- Hands-on experience managing databases in traditional data centers (physical/virtual servers) and Azure (e.g., SQL Server on Azure VMs, Azure SQL Database, Azure Database for MySQL/PostgreSQL or similar).
- Strong understanding of database internals: storage structures, indexing, transactions, isolation levels, and locking.
- Demonstrated skills in performance troubleshooting and tuning using execution plans, wait statistics, and monitoring metrics.
- Practical experience with HA/DR solutions and backup/restore strategies, including testing and documentation of failover/recovery procedures.
- Proficiency with scripting/automation for database operations and integration with operational tooling.
- Familiarity with networking, OS, and virtualization concepts relevant to database performance and connectivity (subnets, firewalls, load balancers, storage latency).
- Solid understanding of security best practices for databases.
Preferred Qualifications
- Experience with Azure-native monitoring and management tools (e.g., Azure Monitor, Log Analytics, Alerts, Managed Identities, Key Vault).
- Experience with CI/CD and database change automation, including schema versioning and deployment pipelines.
- Exposure to large-scale or high-volume databases, partitioning, and scaling strategies (vertical/horizontal).
- Knowledge of regulatory and compliance requirements related to data (e.g., PCI, HIPAA, GDPR) and data protection techniques (masking, tokenization).
- Relevant certifications (e.g., Microsoft Azure, SQL Server, MySQL, PostgreSQL).
Soft Skills
- Strong analytical and problem-solving skills, especially under time pressure during incidents and on-call situations.
- Clear communication skills to work effectively with developers, infrastructure teams, security, and business stakeholders.
- High sense of ownership for data integrity, availability, and reliability, with a structured approach to documentation and process.
Working Conditions (travel, hours, environment)
- Limited travel required, including air and car travel.
- While performing the duties of this job, the employee is occasionally exposed to a warehouse environment and moving vehicles. The noise level in the work environment is typically quiet to moderate.
Physical/Sensory Requirements
Sedentary Work – Ability to exert 10 - 20 pounds of force occasionally, and/or negligible amount of force frequently to lift, carry, push, pull or otherwise move objects. Sedentary work involves sitting most of the time but may involve walking or standing for brief periods of time.
Benefits & Rewards
- Bonus opportunities at every level
- Non-traditional retail hours (we close at 7p!)
- Career advancement opportunities
- Relocation opportunities across the country
- 401k with discretionary company match
- Employee Stock Purchase Plan
- Referral Bonus Program
- 80 hrs. annualized paid vacation (full-time associates)
- 4 paid holidays per year (full-time hourly store associates only)
- 1 paid personal holiday of associate’s choice and Volunteer Time Off program
- Medical, Dental, Vision, Life and other Insurance Plans (subject to eligibility criteria)
Equal Employment Opportunity
Floor & Decor provides equal employment opportunities to all associates and applicants without regard to age, race, color, religion or creed, national origin or ancestry, sex (including pregnancy), sexual orientation, gender, gender identity, disability, veteran status, genetic information, ethnicity, citizenship, or any other category protected by law.
This policy applies to all areas of employment, including recruitment, testing, screening, hiring, selection for training, upgrading, transfer, demotion, layoff, discipline, termination, compensation, benefits and all other privileges, terms and conditions of employment. This policy and the law prohibit employment discrimination against any associate or applicant on the basis of any legally protected status outlined above.
About Wakefern
Wakefern Food Corp. is the largest retailer-owned cooperative in the United States and supports its co-operative members' retail operations, trading under the ShopRite®, Price Rite®, The Fresh Grocer®, Dearborn Markets®, and Gourmet Garage® banners.
Employing an innovative approach to wholesale business services, Wakefern focuses on helping the independent retailer compete in a big business world. Providing the tools entrepreneurs need to stay a step ahead of the competition, Wakefern’s co-operative members benefit from the company’s extensive portfolio of services, including innovative technology, private label development, and best-in-class procurement practices.
The ideal candidate will have a strong background in designing, developing, and implementing complex projects, with a focus on automating data processes and driving efficiency within the organization. This role requires close collaboration with application developers, data engineers, data analysts, and data scientists to ensure seamless data integration and automation across various platforms. The Data Integration & AI Engineer is responsible for identifying opportunities to automate repetitive data processes, reduce manual intervention, and improve overall data accessibility.
Essential Functions
- Participate in the development life cycle (requirements definition, project approval, design, development, and implementation) and maintenance of the systems.
- Implement and enforce data quality and governance standards to ensure data accuracy and consistency.
- Provide input for project plans and timelines to align with business objectives.
- Monitor project progress, identify risks, and implement mitigation strategies.
- Work with cross-functional teams and ensure effective communication and collaboration.
- Provide regular updates to the management team.
- Follow the standards and procedures according to Architecture Review Board best practices, revising standards and procedures as requirements change and technological advancements are incorporated into the technology structure.
- Communicate and promote the code of ethics and business conduct.
- Ensure completion of required company compliance training programs.
- Be trained – either through formal education or through experience – in software/hardware technologies and development methodologies.
- Stay current through personal development and professional and industry organizations.
Responsibilities
- Design, build, and maintain automated data pipelines and ETL processes to ensure scalability, efficiency, and reliability across data operations.
- Develop and implement robust data integration solutions to streamline data flow between diverse systems and databases.
- Continuously optimize data workflows and automation processes to enhance performance, scalability, and maintainability.
- Design and develop end-to-end data solutions utilizing modern technologies, including scripting languages, databases, APIs, and cloud platforms.
- Ensure data solutions and data sources meet quality, security, and compliance standards.
- Monitor and troubleshoot automation workflows, proactively identifying and resolving issues to minimize downtime.
- Provide technical training, documentation, and ongoing support to end users of data automation systems.
- Prepare and maintain comprehensive technical documentation, including solution designs, specifications, and operational procedures.
Qualifications
- A bachelor's degree or higher in computer science, information systems, or a related field.
- Hands-on experience with cloud data platforms (e.g., GCP, Azure, etc.)
- Strong knowledge and skills in data automation technologies, such as Python, SQL, ETL/ELT tools, Kafka, APIs, cloud data pipelines, etc.
- Experience with GCP BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
- Experience with workflow orchestration tools such as Cloud Composer or Airflow
- Proficiency in iPaaS (Integration Platform as a Service) platforms, such as Boomi, SAP BTP, etc.
- Experience developing and managing data integrations for AI agents, connecting them to internal and external APIs, databases, and knowledge sources to expand their capabilities.
- Experience building and maintaining scalable Retrieval-Augmented Generation (RAG) pipelines, including the curation and indexing of knowledge bases in vector databases (e.g., Pinecone, Vertex AI Vector Search).
- Experience leveraging cloud-based AI/ML platforms (e.g., Vertex AI, Azure ML) to build, train, and deploy machine learning models at scale.
- Ability to establish and enforce data quality and governance standards for AI/ML datasets, ensuring the accuracy, completeness, and integrity of data used for model training and validation.
- Ability to collaborate closely with data scientists and machine learning engineers to understand data requirements and deliver optimized data solutions that support the entire machine learning lifecycle.
- Hands-on experience with IBM DataStage and Alteryx is a plus.
- Strong understanding of database design principles, including normalization, indexing, partitioning, and query optimization.
- Ability to design and maintain efficient, scalable, and well-structured database schemas to support both analytical and transactional workloads.
- Familiarity with BI visualization tools such as MicroStrategy, Power BI, Looker, or similar.
- Familiarity with data modeling tools.
- Familiarity with DevOps practices for data (CI/CD pipelines)
- Proficiency in project management software (e.g., JIRA, Clarizen, etc.)
- Strong knowledge and skills in data management, data quality, and data governance.
- Strong communication, collaboration, and problem-solving skills.
- Ability to work on multiple projects and prioritize tasks effectively.
- Ability to work independently and in a team environment.
- Ability to learn new technologies and tools quickly.
- Ability to remain effective in stressful situations.
- Highly developed business acumen.
- Strong critical thinking and decision-making skills.
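The RAG retrieval step mentioned in the qualifications can be illustrated with a toy in-memory version; the bag-of-words "embedding" below is a hypothetical stand-in for a real embedding model, and the document set is invented. A production pipeline would index real embeddings in a vector store such as Pinecone or Vertex AI Vector Search:

```python
import math
from collections import Counter

# Toy "vector database": a small, invented knowledge base.
DOCS = {
    "doc1": "rug sizing guide for living rooms",
    "doc2": "pillow care and washing instructions",
    "doc3": "how to measure a living room for a rug",
}

def embed(text):
    """Hypothetical stand-in for a real embedding model (bag of words)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, k=2):
    """Return the k most similar documents, used to ground an LLM prompt."""
    q = embed(query)
    scored = sorted(DOCS, key=lambda d: cosine(q, embed(DOCS[d])), reverse=True)
    return scored[:k]

print(retrieve("measuring a rug for my living room"))  # ['doc3', 'doc1']
```

The retrieved passages would then be concatenated into the model prompt; swapping `embed` for a learned embedding and `DOCS` for a vector index is what turns this sketch into the pipeline the posting describes.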
Working Conditions & Physical Demands
This position requires in-person office presence at least four days per week.
Compensation and Benefits
The salary range for this position is $75,868 - $150,644. Placement in the range depends on several factors, including experience, skills, education, geography, and budget considerations.
Wakefern is proud to offer a comprehensive benefits package designed to support the health, well-being, and professional development of our Associates. Benefits include medical, dental, and vision coverage, life and disability insurance, a 401(k) retirement plan with company match & annual company contribution, paid time off, holidays, and parental leave.
Associates also enjoy access to wellness and family support programs, fitness reimbursement, educational and training opportunities through our corporate university, and a collaborative, team-oriented work environment. Many of these benefits are fully or partially funded by the company, with some subject to eligibility requirements.
Loloi Rugs is a leading textile brand that designs and crafts rugs, pillows, and throws for the thoughtfully layered home. Family-owned and led since 2004, Loloi is growing more quickly than ever. To date, we’ve expanded our diverse team to hundreds of employees, invested in multiple distribution facilities, introduced thousands of products, and earned the respect and business of retailers and designers worldwide. A testament to our products and our team, Loloi has earned the ARTS Award for “Best Rug Manufacturer” in 2010, 2011, 2015, 2016, 2018, 2023, and 2025.
Security Advisory: Beware of Frauds
Protect yourself from potential fraud and verify the authenticity of any job offer you receive from Loloi. Rest assured that we never request payment or demand any sensitive personal information, such as bank details or social security numbers, at any stage of the recruiting process. To ensure genuine communication, our recruiters will only reach out to applicants using an @ email address. Your security is of paramount importance to us at Loloi, and we are committed to maintaining a safe and trustworthy hiring experience for all candidates.
We are building a Business Operations Center of Excellence, and we need a Product Data Analyst to serve as the "Guardian of the Golden Record." In this role, you are the absolute owner of product data integrity as it relates to the digital customer experience. You ensure that every item we sell is accurately represented across every touchpoint—from our ERP and PIM to our website storefront and marketing feeds. This is not a data entry role; it is a high-impact technical logic and investigation role. You will work directly with our Data Platform and Software Engineering teams to define business rules, audit data health via complex SQL, and troubleshoot data transmission errors before they impact the customer.
Responsibilities
- Storefront Governance: Serve as the absolute owner of product data integrity within the PIM. Ensure that all storefront-critical attributes (pricing, dimensions, weights, image links) are accurate and standardized for a seamless customer experience.
- Technical Data Auditing: Write and run complex SQL queries against our centralized database to identify anomalies, "orphan" records, and data hygiene issues that need resolution. You will be expected to query across multiple schemas to validate data consistency between systems.
- Feed Logic & Mapping: You will manage the logic of how data translates from our PIM to external endpoints. You will ensure that our products appear correctly on Google Shopping, Meta, Amazon, and other marketplaces by managing feed rules and mapping definitions.
- API Payload Analysis: You will act as the first line of defense for data transmission errors. If a product isn't showing up on the site, you will review the JSON/XML response bodies to determine if it is a data payload error or a software code bug.
- Cross-Functional Impact Analysis: You will act as the gatekeeper for data changes, predicting downstream impacts (e.g., "If Merchandising changes this Category Name, it will break the Finance reporting filter").
- Hygiene Logic Definition: You will partner with our IT/Database team to define automated health checks. You identify the "rot" (bad data patterns), and they implement the database constraints to stop it.
What You Will NOT Do (The Boundaries)
- No Web Development: You are not a Front-End Developer. You do not write HTML, CSS, or React code. You ensure the data powering those components is 100% accurate.
- No Manual Data Entry: Your job is not to copy-paste descriptions. You build the systems, bulk processes, and logic that ensure data quality at scale.
- No Database Administration: You do not manage server uptime or schema changes (IT owns this). You own the quality of the records inside the database.
Intersection with Technical Teams
- With IT (Database Mgmt): IT owns the infrastructure and schema; you own the quality of the data within it. When you identify a systemic issue (e.g., "5,000 orphan records"), you partner with IT to implement the technical fix (scripts/constraints).
- With Software Engineering (Commerce): If a product is missing from the site, you check the data payload. If the data is correct, you hand off to Engineering, confirming it is a code/caching bug rather than a data error.
Experience, Skills, & Ability Requirements
- 5-8 years of experience in Data Management, PIM Administration, or technical eCommerce Operations.
- SQL Proficiency: You are comfortable writing queries beyond simple SELECT *. You should be proficient with CTEs (Common Table Expressions), Window Functions (e.g., Rank, Lead/Lag), Subqueries, and complex Joins to act as a forensic data investigator.
- API Fluency: You can read and understand JSON and XML. You know what a valid payload looks like and can spot formatting errors or missing keys.
- Data Manipulation: You are an expert at handling large datasets (CSVs, Excel) and understand data types, formatting standards, and normalization concepts.
- You love hunting down the root cause of an error. You don't just fix the wrong price; you find out why the price was wrong and build a rule to stop it from happening again.
- You have high standards for accuracy. You understand that a wrong weight in the system means a financial loss on shipping for the business.
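As an illustration of the CTE-plus-window-function level of SQL the role expects, here is a sketch of a forensic price audit; the table, dates, and threshold are all hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE price_history (sku TEXT, changed_on TEXT, price REAL);
INSERT INTO price_history VALUES
 ('R-001','2024-01-01',100.0),
 ('R-001','2024-02-01',105.0),
 ('R-001','2024-03-01',  1.0),  -- suspicious drop
 ('R-002','2024-01-01', 80.0),
 ('R-002','2024-02-01', 82.0);
""")

# CTE + LAG(): flag any price change of more than 50% versus the prior value.
rows = conn.execute("""
    WITH deltas AS (
        SELECT sku, changed_on, price,
               LAG(price) OVER (PARTITION BY sku ORDER BY changed_on) AS prev
        FROM price_history
    )
    SELECT sku, changed_on, price, prev
    FROM deltas
    WHERE prev IS NOT NULL AND ABS(price - prev) / prev > 0.5
""").fetchall()

print(rows)  # only the 105.0 -> 1.0 drop on R-001 is flagged
```

The same pattern (a CTE to shape the data, a window function to compare each row to its neighbors) generalizes to duplicate detection, rank-based sampling, and gap analysis across feeds.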
Bonus Points (Nice-to-Haves)
- Familiarity with Visio/Lucidchart to visualize data flows.
- Ability to build simple dashboards in Tableau to track data health scores.
- Basic familiarity with Python or R for data manipulation.
What We Offer
- Health, dental, and vision benefits
- Paid parental leave
- 401(k) with employer match
- A culture of meritocracy that fosters ongoing growth opportunities
- A stable, growing family-owned company that looks after its employees
Loloi Rugs does not discriminate on the basis of race, sex, color, religion, age, national origin, marital status, disability, veteran status, genetic information, sexual orientation, gender identity or any other reason prohibited by law in provision of employment opportunities and benefits. We seek a diverse pool of applicants and consider all qualified candidates regardless of race, ancestry, color, gender identity or expression, sexual orientation, religion, national origin, citizenship, disability, Veteran status, marital status, or any other protected status. If you have a special need or disability that requires accommodation, please let us know.
Job Description: The State of Connecticut (CT) is seeking a Digital Accessibility Web Developer with deep experience in remediating accessibility issues across a wide range of platforms and technologies.
You will partner closely with our accessibility testers and analysts to turn accessibility audit findings into fully remediated digital experiences that meet or exceed compliance standards.
The ideal candidate will have expert-level experience remediating accessibility barriers in CMS systems such as Sitecore, Salesforce, and custom web applications (HTML/ARIA/CSS/JavaScript), as well as working knowledge of AWS services, Biznet platforms, and enterprise databases.
You will be hands-on in HTML and accessibility markup remediation, working primarily within the State's CMS platforms and custom HTML environments.
You'll partner with digital accessibility testers to review audit findings and make front end code corrections to ensure WCAG 2.1 AA compliance.
Remediation Focus Areas
- Apply accessibility fixes to front-end code and markup issues identified through audits (e.g., color corrections, alt text, heading structure, keyboard navigation, link roles, ARIA roles)
- Modify and restructure HTML, CSS, and ARIA to comply with WCAG 2.1 AA standards
- Work within CMS platforms such as Sitecore, Salesforce, and WordPress to correct issues in templates, content types, and presentation layers
- Support content and design teams with accessibility guidance for remediating documents, forms, and embedded media
- Use defect tracking tools (JIRA) to manage tickets and document fixes
- Collaborate with accessibility testers and content strategists to validate remediated work and prevent recurrence of issues
- Share knowledge and remediation patterns with other developers to promote consistency and sustainability
Required Knowledge, Skills, and Abilities
- Bachelor's degree in Computer Science, Software Engineering, IT, or a related field
- 4 years of experience remediating digital accessibility issues in websites, apps, and platforms
- Strong coding experience in HTML, CSS, JavaScript, and ARIA markup
- Working knowledge of Sitecore and Salesforce platforms, with demonstrated remediation success
- Familiarity with Biznet applications, AWS infrastructure, or common enterprise back-end platforms
- Ability to interpret automated and manual testing results (e.g., Axe, ANDI, NVDA, JAWS) and apply solutions
- Expert knowledge of WCAG 2.1 AA standards and assistive technology interactions
- Proficiency in CMS templates, JavaScript frameworks, backend API configuration, and UI component libraries
- Experience troubleshooting keyboard traps, focus management, form label/field logic, and responsive layouts
- Strong ability to work in agile sprints, manage remediation tickets, and track progress in Jira or similar tools
- Ability to collaborate with QA testers, content editors, and project managers in an agile environment
- Excellent communication and documentation skills for explaining fixes and coaching teams
Preferred Skills and Qualifications
- Experience with Sitecore MVC or SXA customization
- Front-end developer or CMS certifications
- Experience with accessibility remediation tools
- Experience with customized CMS themes, templates, and components
- Strong attention to content structure (heading levels, alt text, semantic HTML)
- Experience remediating PDF, Word, or PowerPoint documents (for secondary support)
- Familiarity with CI/CD integration of accessibility checks (e.g., axe-core in pipelines)
- Familiarity with design handoff tools (e.g., Figma or Adobe XD) for accessibility review
Desired Certifications
One or more of the following:
- IAAP WAS (Web Accessibility Specialist), strongly preferred
- IAAP CPACC
- DHS Trusted Tester Certification
- Deque University Developer Track Certificate
- Salesforce Accessibility Champion or similar
- Prior PowerCenter → IDMC migration experience
- Experience or familiarity with Linux system administration activities
Position Summary
Perform a variety of routine and complex skilled and technical work in the maintenance of a Geographic Information System (GIS) relating to the Public Works Computerized Maintenance Management System (CMMS) and asset management program. Act as the primary contact for Public Works CMMS data stewardship. Apply GIS technology to provide GIS and CMMS data related technical support. Perform research, analysis, design and creation of data and applications for use in the Geographic Information System. These tasks are illustrative only and may include other related duties.
Full-time 40 hours per week
AFSCME-represented position
12-month probationary period
Must meet all qualifications and requirements as listed in the position description.
Essential Duties
Collects, inputs, edits, and verifies spatial data from a variety of internal and external data inputs. Integrates associated attribute data. Manipulates, models, and analyzes spatial data in the geographic information system. Documents data entry and related procedures.
Maintains Public Works GIS datasets and mapping system. Applies GIS technology to produce and perform advanced data entry and manipulation, produces documentation, and performs spatial analysis. Develops and runs spatial queries and produces reports.
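The spatial queries described above ultimately rest on geometric tests such as point-in-polygon. A minimal sketch of that test (ray casting) follows; the district boundary and asset coordinates are invented, and real work would use ESRI tooling or a spatial database rather than hand-rolled geometry:

```python
def point_in_polygon(x, y, polygon):
    """Return True if (x, y) falls inside the polygon (list of (x, y) vertices)."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray extending right from the point;
        # an odd number of crossings means the point is inside.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical maintenance district boundary and two asset locations.
district = [(0, 0), (10, 0), (10, 10), (0, 10)]
print(point_in_polygon(5, 5, district))   # True: asset inside the district
print(point_in_polygon(15, 5, district))  # False: asset outside
```

A GIS-backed spatial query ("all assets within district X") is this test run at scale with indexed geometries, which is why CMMS asset records carry spatial attributes in the first place.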
Modifies and maintains CMMS data to support asset data analysis. Collaborates with Asset Management staff and Public Works supervisors on program development by gathering asset information and other data needed for the asset data system to function effectively; creates new codes for these areas and, when necessary, modifies asset characteristics and descriptions.
Coordinates with Public Works program supervisors to efficiently and accurately enter data into the system. Collects and enters asset data into the CMMS and related databases from various sources including direct field investigation; documents such as as-built drawings, invoices, and O&M manuals.
Generates standard and ad-hoc reports using the standard report structure of the asset data system, and other end user reporting tools, provides information for the preparation and distribution of periodic standard location and equipment reports to support maintenance teams and management requests.
Performs quality control checks of asset data to ensure the accuracy of all data within the system.
Provides implementation and ongoing operational support for GIS/CMMS and GIS/CMMS users.
Provides system and data troubleshooting. Collaborates with IT to resolve system or data issues.
Develops programs, procedures, and applications using GIS and related software tools.
Applies software such as CAD, database, spreadsheet, word processing, communications, graphics, and web publishing software to the production and delivery of GIS-related products.
Provides daily user support including routine troubleshooting and system and data maintenance for asset data analysis, working closely with Information Technology to evaluate responsibility for addressing specific requests.
Provides technical assistance and guidance to users of GIS products. Performs departmental-focused project management. Meets with GIS users to define project requirements and set priorities.
Participates on interdepartmental teams and committees for GIS and CMMS projects. Contributes to work group GIS software design projects. Maintains an understanding of the ESRI product portfolio and provides guidance for Public Works' use of available tools.
Operates printers, copiers and large-format plotters, and has ability to load large rolls of paper into plotters.
Acts ethically and honestly; applies ethical standards of behavior to daily work activities and interactions. Builds confidence in the City through own actions.
Conforms with all safety rules and performs work in a safe manner.
Delivers excellent customer service to diverse audiences. Maintains positive customer service demeanor and delivers service in a respectful and patient manner.
Maintains effective work relationships.
Arrives to work, meetings, and other work-related functions on time and maintains regular job attendance.
Complies with all Administrative Policies. Performs work in accordance with Council Policies and Municipal Code sections applicable to position.
Qualifications and Skills
Education and Experience
High School diploma or equivalent. Four years of professional experience in designing, supporting and implementing GIS applications. A post-secondary degree in GIS or closely related field may substitute for up to 4 years of experience.
Strong computer background in GIS software, Computer Aided Drafting software, related third party GIS software applications, database management systems software and windows based operating systems.
Municipal experience is desired.
Knowledge, Skills and Abilities
General knowledge of the principles, theories and methods of database concepts, structures, and programming logic; and the various types, classes, uses, and interrelationships of assets within a typical municipal Public Works department.
Advanced skills in use of GIS and CMMS related software in a production environment.
Ability to program in GIS, relational and spatial database, and web languages is desired.
Good oral and written communication skills; ability to communicate technical information to a non-technical audience, ability to research, interpret and summarize data.
Ability to prioritize multiple projects from numerous customers.
Knowledge of cartographic principles, spatial analysis techniques, and data management practices.
Ability to research and recommend new methods, equipment, or programs to better accomplish tasks.
Ability to travel among City worksites.
Special Requirements
Ability to pass a pre-employment background and/or criminal history check
Demonstrable commitment to sustainability.
Demonstrable commitment to promoting and enhancing equity, diversity and inclusion.
The individual shall not pose a direct threat to the health or safety of the individual or others in the workplace.
How to Apply
Qualified applicants must submit an online application located on the City of Corvallis website (click on "Apply" above).
Resumes will not be accepted in lieu of a completed online application. Incomplete applications will not be accepted/considered.
Position is open until filled.
First review of applications will occur after 8:00 am on February 4, 2026
*Please do not include personal or protected information in attached resumes or cover letters, this includes your birth date, age, dates of education, and graduation dates.*