Data Migration vs. Database Migration Jobs in USA
Visa Status: US Citizen or Green Card Only
Location: Irving, TX (Local Candidates Only)
Employment Type: Full-time / Direct Hire
Work Environment: Hybrid (Monday through Thursday in office; Friday at home)
***MUST HAVE 10+ YEARS EXPERIENCE AS A DATA ENGINEER***
The AWS Senior Data Engineer will own the planning, design, and implementation of data structures for this leading Hospitality Corporation in their AWS environment. This role will be responsible for incorporating all internal and external data sources into a robust, scalable, and comprehensive data model within AWS to support business intelligence and analytics needs throughout the company.
Responsibilities:
- Collaborate with cross-functional teams to understand and define business intelligence needs and translate them into data modeling solutions
- Develop, build, and maintain scalable data pipelines, data schema designs, and dimensional data models in Databricks and AWS for all system data sources, API integrations, and bespoke data ingestion files from external sources, covering both batch and real-time pipelines.
- Perform data cleansing, standardization, and quality control
- Create data models that will support comprehensive data insights, business intelligence tools, and other data science initiatives
- Create data models and ETL procedures with traceability, data lineage and source control
- Design and implement data integration and data quality framework
- Implement data monitoring best practices with trigger based alerts for data processing KPIs and anomalies
- Investigate and remediate data problems, performing and documenting thorough and complete root cause analyses. Make recommendations for mitigation and prevention of future issues.
- Work with Business and IT to assess efficacy of all legacy data sources, making recommendations for migration, anonymization, archival and/or destruction.
- Continually seek to optimize performance through database indexing, query optimization, stored procedures, etc.
- Ensure compliance with data governance and data security requirements, including data life cycle management, purge and traceability.
- Create and manage documentation and change control mechanisms for all technical design, implementations and systems maintenance.
Target Skills and Experience
- Bachelor's or graduate degree in computer science, information systems or related field preferred, or similar combination of education and experience
- At least 10 years’ experience designing and managing data pipelines, schema modeling, and data processing systems.
- Experience with Databricks a plus (or similar tools like Microsoft Fabric, Snowflake, etc.) to drive scalable data solutions.
- Experience with SAP a plus
- Proficient in Python, with a track record of solving real-world data challenges.
- Advanced SQL skills, including experience with database design, query optimization, and stored procedures.
- Experience with Terraform or other infrastructure-as-code tools is a plus.
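The cleansing, standardization, and quality-control duties listed above can be sketched in plain Python. This is a minimal illustration, not the company's actual pipeline; the field names ("email", "country") and validation rules are invented for the example.

```python
# Minimal sketch of a cleansing/standardization step in a batch pipeline.
# Field names and rules are illustrative assumptions, not from the posting.

def standardize(record: dict) -> dict:
    """Trim whitespace and normalize casing on known string fields."""
    out = dict(record)
    if out.get("email") is not None:
        out["email"] = out["email"].strip().lower()
    if out.get("country") is not None:
        out["country"] = out["country"].strip().upper()
    return out

def quality_check(record: dict) -> list[str]:
    """Return a list of rule violations; an empty list means the record passes."""
    issues = []
    if not record.get("email") or "@" not in record["email"]:
        issues.append("invalid email")
    if record.get("country") and len(record["country"]) != 2:
        issues.append("country is not ISO-3166 alpha-2")
    return issues

def run_batch(records):
    """Standardize every record, routing failures to a reject pile."""
    clean, rejected = [], []
    for r in records:
        r = standardize(r)
        issues = quality_check(r)
        (rejected if issues else clean).append((r, issues))
    return [r for r, _ in clean], rejected
```

In a real Databricks/AWS deployment the same standardize/validate/route shape would run as a Spark job over full partitions rather than a Python loop.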
The University of Maryland (UMD) seeks a Manager of Data Analytics Enablement to lead the adoption and modernization of enterprise analytics capabilities that enable trusted, data-informed decision-making across campus.
This is an exciting time to join UMD as we advance enterprise data and analytics through a period of innovative growth and modernization.
This role will play a key part in shaping the future of enterprise business intelligence, advancing Microsoft Power BI and Fabric capabilities, and embedding sustainable data quality and stewardship practices into analytics workflows.
Reporting to the Director of Enterprise Data Services, this position partners with institutional leaders, IT teams, and enterprise stakeholders to deliver reliable data products, consistent metrics, and actionable insights.
The manager will lead a team of data professionals and advance practical, operational governance practices that support trusted analytics and long-term institutional impact.
Key Responsibilities: Lead the strategy, development, and continuous improvement of the university’s enterprise business intelligence environment, including Microsoft Power BI and Microsoft Fabric.
Establish standards, best practices, and architectural patterns for semantic models, dashboards, and analytics delivery.
Guide migration and modernization efforts to ensure scalable, secure, and high-performing analytics solutions.
Develop and manage an analytics intake, prioritization, and delivery framework aligned with institutional priorities.
Define and implement data quality monitoring practices to ensure reliability, accuracy, and consistency of enterprise data assets.
Partner with technical teams to embed validation, monitoring, and observability into data pipelines and lakehouse environments.
Promote consistent metric definitions and collaborate with campus stakeholders to clarify data ownership and stewardship roles.
Support adoption of metadata management, data catalog, and lineage capabilities.
Ensure analytics solutions align with university standards for security, privacy, and responsible data use.
Manage, mentor, and develop a team of analytics and data professionals, fostering a culture of quality, collaboration, and service.
Communicate analytics priorities, progress, and impact to leadership and campus partners.
This position is considered essential and may be required to work at the normal work location or an alternative location during a major catastrophic event, weather emergency, or other operational emergency to help maintain the continuity of University services.
May be required to work evenings, nights, weekends, or different shifts for extended periods.
KNOWLEDGE, SKILLS, & ABILITIES: Knowledge of data privacy and security principles and practices necessary to protect systems and data from threats.
Knowledge in areas of subject matter expertise such as databases, data modeling, ETL, reporting, data governance practices, metadata management, data stewardship, and/or regulatory compliance.
Skill in SQL or programming/scripting languages (e.g., Python) used for integrations, data pipelines, report development, and data management.
Skill in adapting communication style to different audiences, including technical, business, and executive stakeholders.
Skill in the use of office productivity software such as Office 365 or Google Workspace.
Ability to lead presentations and training for large groups.
Ability to manage communications and relationships with technical and business stakeholders.
Ability to collaborate effectively with other Managers, Assistant Directors, and Directors to identify and solve problems, make improvements, and address ongoing issues.
Ability to provide a team with effective direction and support in implementations using standards and techniques that lead to a repeatable and reliable solution.
Ability to ensure documentation standards and procedures are implemented for all team responsibilities.
Ability to define deadlines and manage the quality of the work delivered.
Ability to comprehend and handle interpersonal dynamics, demonstrate empathy towards team members, and effectively manage conflicts or challenging circumstances.
Ability to coach and mentor team members in order to enhance their performance, provide constructive feedback, and support skill development.
Physical Demands: Sedentary work.
Exerting up to 10 pounds of force occasionally and/or negligible amount of force frequently or constantly to lift, carry, push, pull or otherwise move objects.
Repetitive motion.
Substantial movements (motions) of the wrists, hands, and/or fingers.
The worker is required to have close visual acuity to perform an activity such as: preparing and analyzing data and figures; transcribing; viewing a computer terminal; extensive reading.
Minimum Qualifications Education: Bachelor’s degree from an accredited college or university.
Experience: Three (3) years of professional experience supporting the operations, maintenance, and administration of data systems, analytics platforms, or data management programs.
One (1) year leading or supervising professional staff.
Other: Additional work experience as defined above may be substituted on a year for year basis for up to four (4) years of the required education.
Preferences: Demonstrated experience leading business intelligence or enterprise analytics initiatives.
Experience managing or mentoring data professionals in a collaborative team environment.
Strong experience with Power BI and modern data platforms such as Microsoft Fabric, Databricks, or similar cloud-based analytics ecosystems.
Proficiency with SQL and/or Python in support of analytics, data modeling, or data quality initiatives.
Experience implementing or advancing data quality practices, including validation, monitoring, or metric standardization.
Experience supporting practical data governance activities such as establishing shared definitions, coordinating data stewardship, or implementing metadata/catalog tools.
Demonstrated ability to collaborate across diverse stakeholders and translate business needs into scalable analytics solutions.
Strong communication skills with the ability to engage both technical and non-technical audiences.
Experience using Jira or similar tools for work intake, project tracking, and prioritization.
Additional Information: Please note that all positions within the Division of Information Technology (DIT) have an in person component with expected time in our College Park, MD location per week.
Telework is not a guaranteed work arrangement.
Visa Sponsorship Information: DIT will not sponsor the successful candidate for work authorization in the United States now or in the future.
F1 STEM OPT support is not available for this position.
Required Application Materials: Resume, Cover Letter, List of Three References
Best Consideration Date: March 26, 2026
Open Until Filled: Yes
Salary Range: $149,120.00 - $178,944.00
Please apply at:
Job Risks: Not Applicable to This Position
Financial Disclosure Required: No. For more information on Financial Disclosure, please visit Maryland's State Ethics Commission website.
Department: DIT-EE-Enterprise Data Services
Worker Sub-Type: Staff Regular
Benefits Summary: For more information on Regular Exempt benefits, select this link.
Background Checks: Offers of employment are contingent on completion of a background check.
Information reported by the background check will not automatically disqualify anyone from employment.
Before any adverse decision, the finalist will have an opportunity to provide information to the University regarding disclosable background check information.
The University reserves the right to rescind the offer of employment or otherwise decline or terminate employment if the information reported by the background check is deemed incompatible with the position, regardless of when the background check is completed.
Employment Eligibility: The successful candidate must complete employment eligibility verification (on Form I-9) by presenting documents that establish identity and work authorization within the timeframe required by federal immigration law, and where applicable, to demonstrate renewed employment authorization.
Failure to complete employment eligibility verification or reverification within the timeframe set forth by law may result in suspension or termination of employment.
EEO Statement : The University of Maryland, College Park is an Equal Opportunity Employer.
All qualified applicants will receive equal consideration for employment.
Please read the University’s Equal Employment Opportunity Statement of Policy.
Title IX Non-Discrimination Notice
Job Description Summary
We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. In this role, you will be instrumental in designing, building, and maintaining robust and scalable data pipelines and solutions within the Microsoft Azure ecosystem. You will be responsible for developing and optimizing ETL/ELT processes, ensuring data quality, and enabling efficient data access for analytics and business intelligence. We are looking for a hands-on engineer who thrives in a fast-paced environment and is passionate about leveraging cutting-edge technologies.
Key Responsibilities:
Design, develop, and maintain cloud-based data pipelines and ETL/ELT workflows.
Build and optimize data architectures to support structured and unstructured data processing.
Collaborate with data analysts, data scientists, and business stakeholders to understand data needs.
Implement data quality, security, and governance best practices.
Monitor and troubleshoot data workflows to ensure high availability and performance.
Optimize database and data storage solutions for performance and cost efficiency.
Contribute to cloud adoption, migration, and modernization initiatives.
Mandatory Skills:
Strong expertise with Azure cloud platform.
Strong experience in Databricks
Azure Data Factory proficiency required; building datasets, data flows, and pipelines in ADF (not just maintaining something already built)
Hands-on experience with ETL/ELT tools and frameworks.
Proficiency in SQL, Python, and data modeling.
Knowledge of CI/CD pipelines and infrastructure-as-code tools.
Understanding of data governance, security, and compliance.
Preferred Skills:
Exposure to API integration and microservices architecture.
Strong analytical and problem-solving skills.
Azure cloud certifications and/or past experience
AKS (Azure Kubernetes Service) experience, and ETL related to applications containerized & deployed on AKS (or EKS)
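The requirement above to build datasets, data flows, and pipelines in Azure Data Factory (rather than maintain existing ones) usually involves the high-water-mark incremental-load pattern. A minimal sketch of that pattern in plain Python, where the "modified_at" column and the sample rows are hypothetical:

```python
# Sketch of the high-water-mark (incremental load) pattern commonly built
# in ADF copy pipelines: pull only rows changed since the last run, then
# persist the new watermark for the next run. Column name is an assumption.

def incremental_load(source_rows, last_watermark):
    """Return rows modified after last_watermark plus the new watermark."""
    changed = [r for r in source_rows if r["modified_at"] > last_watermark]
    new_watermark = max((r["modified_at"] for r in changed), default=last_watermark)
    return changed, new_watermark
```

A second run with the returned watermark picks up nothing until the source changes again, which is what makes the pattern safe to schedule repeatedly.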
Purpose
The IT Database Engineer is responsible for designing, implementing, and supporting relational database platforms in both traditional data centers and Azure cloud environments. The role covers installation, configuration, performance tuning, high availability, backup and recovery, monitoring, and incident response for Microsoft SQL Server, MySQL, and PostgreSQL, with participation in an on-call rotation to support mission-critical workloads.
Key Responsibilities
- Install, configure, and upgrade MSSQL, MySQL, and PostgreSQL in data center and Azure environments (IaaS and/or PaaS as applicable).
- Perform day-to-day database administration, including user and role management, permissions, schema changes, and maintenance tasks.
- Monitor database health, performance, and capacity using native and third-party tools; define meaningful alerts and dashboards for proactive issue detection.
- Troubleshoot database incidents (performance issues, blocking/deadlocks, failed jobs, connectivity problems, resource constraints) and drive root-cause analysis and permanent fixes.
- Design, implement, and maintain backup and recovery strategies (full/diff/log, PITR, snapshots, Azure backup options) and regularly test restore procedures.
- Implement and support high availability and disaster recovery configurations (e.g., SQL Server Always On, failover clustering, log shipping, MySQL/Postgres replication, Azure availability sets/zones).
- Optimize database performance through indexing strategies, query tuning, statistics management, and configuration tuning at both OS and database levels.
- Implement and enforce security controls (authentication, authorization, encryption at rest/in transit, auditing) aligned with organizational and regulatory requirements.
- Support application and development teams with database design, query optimization, and controlled deployment of schema changes across environments.
- Maintain detailed documentation including runbooks, standards, topology diagrams, data flows, and operational procedures for both on-prem and Azure deployments.
- Participate in an on-call rotation, responding to after-hours incidents, and perform planned maintenance during maintenance windows.
- Automate routine tasks (provisioning, checks, patching, reporting) using scripts and tooling (e.g., T-SQL, PowerShell, Bash, Python, Azure CLI).
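The "automate routine tasks" bullet above can be illustrated with a small check script: flag databases whose last full backup is older than a recovery-point threshold. The backup timestamps here are invented; in practice they would be read from msdb.dbo.backupset on SQL Server or the equivalent catalog on MySQL/PostgreSQL.

```python
# Illustrative routine check: report databases whose most recent full
# backup exceeds a maximum allowed age. Input data is a hypothetical
# mapping of database name -> last full-backup timestamp.
from datetime import datetime, timedelta

def stale_backups(last_backup_at: dict, now: datetime,
                  max_age: timedelta = timedelta(hours=24)) -> list[str]:
    """Return database names whose last full backup is older than max_age."""
    return sorted(db for db, ts in last_backup_at.items() if now - ts > max_age)
```

A script like this would typically feed an alerting channel so the on-call engineer is paged before a restore is ever needed.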
Required Qualifications
- Proven experience as a Database Engineer/DBA supporting MSSQL, MySQL, and PostgreSQL in production environments.
- Hands-on experience managing databases in traditional data centers (physical/virtual servers) and Azure (e.g., SQL Server on Azure VMs, Azure SQL Database, Azure Database for MySQL/PostgreSQL or similar).
- Strong understanding of database internals: storage structures, indexing, transactions, isolation levels, and locking.
- Demonstrated skills in performance troubleshooting and tuning using execution plans, wait statistics, and monitoring metrics.
- Practical experience with HA/DR solutions and backup/restore strategies, including testing and documentation of failover/recovery procedures.
- Proficiency with scripting/automation for database operations and integration with operational tooling.
- Familiarity with networking, OS, and virtualization concepts relevant to database performance and connectivity (subnets, firewalls, load balancers, storage latency).
- Solid understanding of security best practices for databases.
Preferred Qualifications
- Experience with Azure-native monitoring and management tools (e.g., Azure Monitor, Log Analytics, Alerts, Managed Identities, Key Vault).
- Experience with CI/CD and database change automation, including schema versioning and deployment pipelines.
- Exposure to large-scale or high-volume databases, partitioning, and scaling strategies (vertical/horizontal).
- Knowledge of regulatory and compliance requirements related to data (e.g., PCI, HIPAA, GDPR) and data protection techniques (masking, tokenization).
- Relevant certifications (e.g., Microsoft Azure, SQL Server, MySQL, PostgreSQL).
Soft Skills
- Strong analytical and problem-solving skills, especially under time pressure during incidents and on-call situations.
- Clear communication skills to work effectively with developers, infrastructure teams, security, and business stakeholders.
- High sense of ownership for data integrity, availability, and reliability, with a structured approach to documentation and process.
Working Conditions (travel, hours, environment)
- Limited travel required including air and car travel
- While performing the duties of this job, the employee is occasionally exposed to a warehouse environment and moving vehicles. The noise level in the work environment is typically quiet to moderate.
Physical/Sensory Requirements
Sedentary Work – Ability to exert 10 - 20 pounds of force occasionally, and/or negligible amount of force frequently to lift, carry, push, pull or otherwise move objects. Sedentary work involves sitting most of the time but may involve walking or standing for brief periods of time.
Benefits & Rewards
- Bonus opportunities at every level
- Non-traditional retail hours (we close at 7p!)
- Career advancement opportunities
- Relocation opportunities across the country
- 401k with discretionary company match
- Employee Stock Purchase Plan
- Referral Bonus Program
- 80 hrs. annualized paid vacation (full-time associates)
- 4 paid holidays per year (full-time hourly store associates only)
- 1 paid personal holiday of associate’s choice and Volunteer Time Off program
- Medical, Dental, Vision, Life and other Insurance Plans (subject to eligibility criteria)
Equal Employment Opportunity
Floor & Decor provides equal employment opportunities to all associates and applicants without regard to age, race, color, religion or creed, national origin or ancestry, sex (including pregnancy), sexual orientation, gender, gender identity, disability, veteran status, genetic information, ethnicity, citizenship, or any other category protected by law.
This policy applies to all areas of employment, including recruitment, testing, screening, hiring, selection for training, upgrading, transfer, demotion, layoff, discipline, termination, compensation, benefits and all other privileges, terms and conditions of employment. This policy and the law prohibit employment discrimination against any associate or applicant on the basis of any legally protected status outlined above.
Summary
We are seeking a highly skilled Data Engineer to build and manage our data infrastructure. The ideal candidate will be an expert in writing complex SQL queries, designing efficient database schemas, and developing ETL/ELT pipelines. You will ensure data is accurate, accessible, and optimized for performance to support business intelligence, analytics, and reporting needs.
Key Responsibilities
- Database Design & Management: Design, develop, and maintain relational databases (e.g., SQL Server, PostgreSQL, Oracle) and cloud-based data warehouses.
- Strategic SQL and Data Engineering: Develop sophisticated, optimized SQL queries, stored procedures, and functions to process and analyze large, complex datasets for actionable business insights.
- Data Pipeline Automation & Orchestration: Help build, automate, and orchestrate ETL/ELT workflows utilizing SQL, Python, and cloud-native tools to integrate and transform data from diverse, distributed sources.
- Performance Optimization: Tune queries and optimize database schema (indexing, partitioning, normalization) to improve data retrieval and processing speeds.
- Data Integrity & Security: Ensure data quality, consistency, and integrity across systems. Implement data masking, encryption, and role-based access control (RBAC).
- Documentation: Maintain technical documentation for database schemas, data dictionaries, and ETL workflows.
Required Skills and Qualifications
- Education: Bachelor’s degree in Computer Science, Information Systems, or a related field.
- SQL Mastery: 5+ years of experience with advanced SQL (window functions, CTEs, query optimization).
- Database Expertise: Deep understanding of relational database management systems (RDBMS) and data modeling techniques.
- Cloud Platforms: Demonstrated experience with Azure Data Services and other data warehouse technologies.
- Programming: Proficiency in Python for scripting and data manipulation.
- ETL Tools: Familiarity with tools like SSIS or Azure Data Factory.
- Soft Skills: Strong analytical thinking, problem-solving, and communication skills.
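The "advanced SQL" items above (window functions, CTEs) can be demonstrated in a self-contained way against an in-memory SQLite database. The orders table and its values are invented for the example:

```python
# Small illustration of a CTE plus a window function: aggregate per
# customer in a CTE, then rank customers by total spend with RANK() OVER.
# Table and data are hypothetical; SQLite supports window functions
# since version 3.25.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount INTEGER);
    INSERT INTO orders VALUES
        ('alice', 10), ('alice', 30), ('bob', 20);
""")

rows = conn.execute("""
    WITH totals AS (
        SELECT customer, SUM(amount) AS total
        FROM orders
        GROUP BY customer
    )
    SELECT customer, total,
           RANK() OVER (ORDER BY total DESC) AS spend_rank
    FROM totals
    ORDER BY spend_rank
""").fetchall()
```

The same CTE-then-window shape carries over directly to SQL Server or Azure SQL; only the connection layer changes.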
Nice to Have
- Experience with NoSQL databases (Cosmos DB, MongoDB).
- Experience with big data frameworks (Apache Spark, Kafka).
- Relevant certifications (e.g., Microsoft Certified: Azure Data Engineer Associate, Google Professional Data Engineer).
Typical Work Environment
- Tools Used: SQL IDEs (DBeaver, SSMS), Cloud Consoles, Git, Jira, SSIS.
- Industry: Leasing.
Salary is $130k - $140k.
Job Title: Sr. Data Center Implementation Engineer (VMware & Cisco UCS)
Work Location: Full Onsite, Bethesda, Maryland
Visa: USC, GC Only
Interview Process: Video
Rate: $65/hr on W2 (some flexibility)
Key Responsibilities:
VMware Infrastructure Deployment & Migration:
Design, implement, and manage VMware vSphere and vCenter environments.
Plan and execute the migration of VMware ESXi workloads to VMware Cloud Foundation (VCF).
Use VMware tools and best practices to facilitate seamless migration from ESXi to VCF, ensuring minimal downtime and no data loss.
Configure and optimize VMware ESXi hosts, clusters, and virtual machines on VCF platforms for performance and scalability.
VMware Cloud Foundation (VCF) Management:
Oversee the deployment and management of VMware Cloud Foundation (VCF), integrating it with existing VMware environments.
Manage and maintain VCF components, including vSphere, vSAN, NSX, and vRealize Suite, to provide a fully automated cloud infrastructure platform.
Monitor VCF health, capacity, and performance, and provide recommendations for optimization.
Cisco UCS Management:
Implement, configure, and maintain Cisco UCS hardware, including servers, fabric interconnects, and chassis.
Create and manage server profiles and policies within Cisco UCS Manager.
Troubleshoot and perform firmware upgrades on Cisco UCS components, ensuring seamless integration with VMware.
Storage Solutions:
Implement and maintain storage solutions within VMware environments, including SAN/NAS and VMware vSAN.
Integrate storage into VMware clusters, ensuring redundancy, high availability, and performance.
Provide support for storage provisioning, LUNs, datastores, and VMware storage policies.
Automation & Scripting:
Leverage automation tools such as PowerCLI, Ansible, or Python to streamline VMware and Cisco UCS operations.
Automate common tasks and configuration changes to improve efficiency and reduce manual effort.
Project Implementation & Delivery:
Collaborate with cross-functional teams to ensure smooth and timely delivery of IT infrastructure projects.
Assist in the creation of project timelines, risk assessments, and detailed implementation plans.
Provide high-level and granular configuration documentation for systems and solutions.
Support & Troubleshooting:
Provide ongoing support for VMware and Cisco UCS environments, ensuring minimal downtime.
Troubleshoot hardware, software, and configuration issues within both VMware and Cisco UCS environments.
Monitor systems for performance and capacity issues, responding to alerts and ensuring optimal system uptime.
Required Qualifications:
At least 5 years of hands-on experience with VMware vSphere and vCenter in enterprise environments, including experience with VMware Cloud Foundation (VCF) for managing cloud infrastructure
Strong experience with Cisco UCS hardware, including servers, fabric interconnects, and UCS Manager.
Experience working with VMware storage technologies (vSAN, iSCSI, NFS, etc.).
Proficiency in configuring, optimizing, and managing VMware clusters, virtual machines, and storage within VMware Cloud Foundation (VCF)
Familiarity with server provisioning and resource management in VMware environments.
Strong troubleshooting skills for both VMware and Cisco UCS systems.
Experience with scripting tools like PowerCLI or Python to automate VMware and UCS tasks, including automation for VCF management.
Good knowledge of system monitoring, backup, and disaster recovery processes in virtualized environments, including VCF-based solutions.
Preferred Qualifications:
VMware Certified Professional (VCP) certification.
Experience with VMware NSX, VMware vRealize Suite, VMware Cloud Foundation (VCF), or other VMware enterprise tools.
Experience with storage systems such as Pure Storage, NetApp, or similar platforms.
Familiarity with cloud environments or hybrid cloud solutions.
Knowledge of automation frameworks (Ansible, Terraform) for infrastructure management.
Work Environment:
Office environment with the potential for occasional client visits.
Requirement for full-time on-site presence at customer location.
Some travel may be required for implementation, support, or training.
Best Regards,
Jaideep Shastri
916-365-9533 (D)
About Wakefern
Wakefern Food Corp. is the largest retailer-owned cooperative in the United States and supports its co-operative members' retail operations, trading under the ShopRite®, Price Rite®, The Fresh Grocer®, Dearborn Markets®, and Gourmet Garage® banners.
Employing an innovative approach to wholesale business services, Wakefern focuses on helping the independent retailer compete in a big business world. Providing the tools entrepreneurs need to stay a step ahead of the competition, Wakefern’s co-operative members benefit from the company’s extensive portfolio of services, including innovative technology, private label development, and best-in-class procurement practices.
The ideal candidate will have a strong background in designing, developing, and implementing complex projects, with a focus on automating data processes and driving efficiency within the organization. This role requires close collaboration with application developers, data engineers, data analysts, and data scientists to ensure seamless data integration and automation across various platforms. The Data Integration & AI Engineer is responsible for identifying opportunities to automate repetitive data processes, reduce manual intervention, and improve overall data accessibility.
Essential Functions
- Participate in the development life cycle (requirements definition, project approval, design, development, and implementation) and maintenance of the systems.
- Implement and enforce data quality and governance standards to ensure data accuracy and consistency.
- Provide input for project plans and timelines to align with business objectives.
- Monitor project progress, identify risks, and implement mitigation strategies.
- Work with cross-functional teams and ensure effective communication and collaboration.
- Provide regular updates to the management team.
- Follow the standards and procedures according to Architecture Review Board best practices, revising standards and procedures as requirements change and technological advancements are incorporated into the technology structure.
- Communicates and promotes the code of ethics and business conduct.
- Ensures completion of required company compliance training programs.
- Is trained – either through formal education or through experience – in software / hardware technologies and development methodologies.
- Stays current through personal development and professional and industry organizations.
Responsibilities
- Design, build, and maintain automated data pipelines and ETL processes to ensure scalability, efficiency, and reliability across data operations.
- Develop and implement robust data integration solutions to streamline data flow between diverse systems and databases.
- Continuously optimize data workflows and automation processes to enhance performance, scalability, and maintainability.
- Design and develop end-to-end data solutions utilizing modern technologies, including scripting languages, databases, APIs, and cloud platforms.
- Ensure data solutions and data sources meet quality, security, and compliance standards.
- Monitor and troubleshoot automation workflows, proactively identifying and resolving issues to minimize downtime.
- Provide technical training, documentation, and ongoing support to end users of data automation systems.
- Prepare and maintain comprehensive technical documentation, including solution designs, specifications, and operational procedures.
Qualifications
- A bachelor's degree or higher in computer science, information systems, or a related field.
- Hands-on experience with cloud data platforms (e.g., GCP, Azure, etc.)
- Strong knowledge and skills in data automation technologies, such as Python, SQL, ETL/ELT tools, Kafka, APIs, cloud data pipelines, etc.
- Experience with GCP BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
- Experience with workflow orchestration tools such as Cloud Composer or Airflow
- Proficiency in iPaaS (Integration Platform as a Service) platforms, such as Boomi, SAP BTP, etc.
- Develop and manage data integrations for AI agents, connecting them to internal and external APIs, databases, and knowledge sources to expand their capabilities.
- Build and maintain scalable Retrieval-Augmented Generation (RAG) pipelines, including the curation and indexing of knowledge bases in vector databases (e.g., Pinecone, Vertex AI Vector Search).
- Leverage cloud-based AI/ML platforms (e.g., Vertex AI, Azure ML) to build, train, and deploy machine learning models at scale.
- Establish and enforce data quality and governance standards for AI/ML datasets, ensuring the accuracy, completeness, and integrity of data used for model training and validation.
- Collaborate closely with data scientists and machine learning engineers to understand data requirements and deliver optimized data solutions that support the entire machine learning lifecycle.
- Hands-on experience with IBM DataStage and Alteryx is a plus.
- Strong understanding of database design principles, including normalization, indexing, partitioning, and query optimization.
- Ability to design and maintain efficient, scalable, and well-structured database schemas to support both analytical and transactional workloads.
- Familiarity with BI visualization tools such as MicroStrategy, Power BI, Looker, or similar.
- Familiarity with data modeling tools.
- Familiarity with DevOps practices for data (CI/CD pipelines)
- Proficiency in project management software (e.g., JIRA, Clarizen, etc.)
- Strong knowledge and skills in data management, data quality, and data governance.
- Strong communication, collaboration, and problem-solving skills.
- Ability to work on multiple projects and prioritize tasks effectively.
- Ability to work independently and in a team environment.
- Ability to learn new technologies and tools quickly.
- Ability to handle stressful situations.
- Highly developed business acumen.
- Strong critical thinking and decision-making skills.
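For illustration, the data-quality and governance work called out above often starts with a row-level quality gate that a pipeline applies before loading records into the warehouse. The sketch below is a minimal example of that pattern; the field names and rules are hypothetical, not taken from any specific system.

```python
# Hypothetical row-level quality gate applied before loading a batch into a
# warehouse table. REQUIRED_FIELDS and the timestamp rule are assumptions.
from datetime import datetime

REQUIRED_FIELDS = {"record_id", "source_system", "event_ts"}

def validate_row(row: dict) -> list[str]:
    """Return a list of quality-rule violations for one record (empty = clean)."""
    errors = []
    missing = REQUIRED_FIELDS - row.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    ts = row.get("event_ts")
    if ts is not None:
        try:
            datetime.fromisoformat(ts)
        except (TypeError, ValueError):
            errors.append(f"bad timestamp: {ts!r}")
    return errors

def partition_batch(rows):
    """Split a batch into loadable rows and quarantined rows with reasons."""
    clean, quarantine = [], []
    for row in rows:
        errs = validate_row(row)
        if errs:
            quarantine.append((row, errs))
        else:
            clean.append(row)
    return clean, quarantine
```

Quarantining failing rows with their reasons, rather than silently dropping them, is what makes the downstream root-cause analysis described in these postings possible.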
Working Conditions & Physical Demands
This position requires in-person office presence at least 4x a week.
Compensation and Benefits
The salary range for this position is $75,868 - $150,644. Placement in the range depends on several factors, including experience, skills, education, geography, and budget considerations.
Wakefern is proud to offer a comprehensive benefits package designed to support the health, well-being, and professional development of our Associates. Benefits include medical, dental, and vision coverage, life and disability insurance, a 401(k) retirement plan with company match & annual company contribution, paid time off, holidays, and parental leave.
Associates also enjoy access to wellness and family support programs, fitness reimbursement, educational and training opportunities through our corporate university, and a collaborative, team-oriented work environment. Many of these benefits are fully or partially funded by the company, with some subject to eligibility requirements.
Loloi Rugs is a leading textile brand that designs and crafts rugs, pillows, and throws for the thoughtfully layered home. Family-owned and led since 2004, Loloi is growing more quickly than ever. To date, we’ve expanded our diverse team to hundreds of employees, invested in multiple distribution facilities, introduced thousands of products, and earned the respect and business of retailers and designers worldwide. A testament to our products and our team, Loloi has earned the ARTS Award for “Best Rug Manufacturer” in 2010, 2011, 2015, 2016, 2018, 2023, and 2025.
Security Advisory: Beware of Frauds
Protect yourself from potential fraud and verify the authenticity of any job offer you receive from Loloi. Rest assured that we never request payment or demand any sensitive personal information, such as bank details or social security numbers, at any stage of the recruiting process. To ensure genuine communication, our recruiters will only reach out to applicants using an @ email address. Your security is of paramount importance to us at Loloi, and we are committed to maintaining a safe and trustworthy hiring experience for all candidates.
We are building a Business Operations Center of Excellence, and we need a Product Data Analyst to serve as the "Guardian of the Golden Record." In this role, you are the absolute owner of product data integrity as it relates to the digital customer experience. You ensure that every item we sell is accurately represented across every touchpoint—from our ERP and PIM to our website storefront and marketing feeds. This is not a data entry role; it is a high-impact technical logic and investigation role. You will work directly with our Data Platform and Software Engineering teams to define business rules, audit data health via complex SQL, and troubleshoot data transmission errors before they impact the customer.
Responsibilities
- Storefront Governance: Serve as the absolute owner of product data integrity within the PIM. Ensure that all storefront-critical attributes (pricing, dimensions, weights, image links) are accurate and standardized for a seamless customer experience.
- Technical Data Auditing: Write and run complex SQL queries against our centralized database to identify anomalies, "orphan" records, and data hygiene issues that need resolution. You will be expected to query across multiple schemas to validate data consistency between systems.
- Feed Logic & Mapping: You will manage the logic of how data translates from our PIM to external endpoints. You will ensure that our products appear correctly on Google Shopping, Meta, Amazon, and other marketplaces by managing feed rules and mapping definitions.
- API Payload Analysis: You will act as the first line of defense for data transmission errors. If a product isn't showing up on the site, you will review the JSON/XML response bodies to determine if it is a data payload error or a software code bug.
- Cross-Functional Impact Analysis: You will act as the gatekeeper for data changes, predicting downstream impacts (e.g., "If Merchandising changes this Category Name, it will break the Finance reporting filter").
- Hygiene Logic Definition: You will partner with our IT/Database team to define automated health checks. You identify the "rot" (bad data patterns), and they implement the database constraints to stop it.
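The "Technical Data Auditing" responsibility above centers on one recurring pattern: a LEFT JOIN that surfaces records present in one system but absent in another. The sketch below shows that orphan-record audit against an in-memory SQLite database; the table and column names are hypothetical stand-ins for a real PIM/storefront schema.

```python
# Minimal sketch of an orphan-record audit: PIM products with no matching
# storefront record. Tables, columns, and sample SKUs are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE pim_products (sku TEXT PRIMARY KEY, title TEXT);
    CREATE TABLE storefront_items (sku TEXT PRIMARY KEY, price REAL);
    INSERT INTO pim_products VALUES ('RUG-001', 'Wool Rug'), ('RUG-002', 'Jute Rug');
    INSERT INTO storefront_items VALUES ('RUG-001', 129.00);
""")

# CTE + LEFT JOIN: any PIM product with no storefront row is an "orphan".
orphans = conn.execute("""
    WITH missing AS (
        SELECT p.sku, p.title
        FROM pim_products p
        LEFT JOIN storefront_items s ON s.sku = p.sku
        WHERE s.sku IS NULL
    )
    SELECT sku FROM missing ORDER BY sku
""").fetchall()

print(orphans)  # → [('RUG-002',)]
```

The same LEFT JOIN ... IS NULL shape works across schemas and vendors, which is why it is the workhorse query for cross-system consistency checks.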
What You Will NOT Do (The Boundaries)
- No Web Development: You are not a Front-End Developer. You do not write HTML, CSS, or React code. You ensure the data powering those components is 100% accurate.
- No Manual Data Entry: Your job is not to copy-paste descriptions. You build the systems, bulk processes, and logic that ensure data quality at scale.
- No Database Administration: You do not manage server uptime or schema changes (IT owns this). You own the quality of the records inside the database.
Intersection with Technical Teams
- With IT (Database Mgmt): IT owns the infrastructure and schema; you own the quality of the data within it. When you identify a systemic issue (e.g., "5,000 orphan records"), you partner with IT to implement the technical fix (scripts/constraints).
- With Software Engineering (Commerce): If a product is missing from the site, you check the data payload. If the data is correct, you hand off to Engineering, confirming it is a code/caching bug rather than a data error.
Experience, Skills, & Ability Requirements
- 5-8 years of experience in Data Management, PIM Administration, or technical eCommerce Operations.
- SQL Proficiency: You are comfortable writing queries beyond simple SELECT *. You should be proficient with CTEs (Common Table Expressions), Window Functions (e.g., Rank, Lead/Lag), Subqueries, and complex Joins to act as a forensic data investigator.
- API Fluency: You can read and understand JSON and XML. You know what a valid payload looks like and can spot formatting errors or missing keys.
- Data Manipulation: You are an expert at handling large datasets (CSVs, Excel) and understand data types, formatting standards, and normalization concepts.
- You love hunting down the root cause of an error. You don't just fix the wrong price; you find out why the price was wrong and build a rule to stop it from happening again.
- You have high standards for accuracy. You understand that a wrong weight in the system means a financial loss on shipping for the business.
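As a concrete illustration of the "API Fluency" requirement above, triaging a feed error usually starts with checking whether a payload parses and contains the expected keys. The sketch below assumes a hypothetical product-feed schema; real feeds define their own required fields.

```python
# Sketch of a payload sanity check for one product-feed JSON response.
# REQUIRED_KEYS and the price-type rule are hypothetical assumptions.
import json

REQUIRED_KEYS = {"sku", "price", "weight_lbs", "image_url"}

def audit_payload(raw: str) -> list[str]:
    """Return human-readable problems found in one product JSON payload."""
    try:
        item = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc.msg}"]
    problems = [f"missing key: {k}" for k in sorted(REQUIRED_KEYS - item.keys())]
    if isinstance(item.get("price"), str):
        problems.append("price is a string, expected a number")
    return problems

print(audit_payload('{"sku": "RUG-001", "price": "129.00", "weight_lbs": 18}'))
# → ['missing key: image_url', 'price is a string, expected a number']
```

If a check like this comes back clean but the product still fails to render, that is the signal to hand the ticket to Engineering as a code or caching bug rather than a data error.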
Bonus Points (Nice-to-Haves)
- Familiarity with Visio/Lucidchart to visualize data flows.
- Ability to build simple dashboards in Tableau to track data health scores.
- Basic familiarity with Python or R for data manipulation.
What We Offer
- Health, dental, and vision benefits
- Paid parental leave
- 401(k) with employer match
- A culture of meritocracy that fosters ongoing growth opportunities
- A stable, growing family-owned company that looks after its employees
Loloi Rugs does not discriminate on the basis of race, sex, color, religion, age, national origin, marital status, disability, veteran status, genetic information, sexual orientation, gender identity or any other reason prohibited by law in provision of employment opportunities and benefits. We seek a diverse pool of applicants and consider all qualified candidates regardless of race, ancestry, color, gender identity or expression, sexual orientation, religion, national origin, citizenship, disability, Veteran status, marital status, or any other protected status. If you have a special need or disability that requires accommodation, please let us know.
Job Description: The State of Connecticut (CT) is seeking a Digital Accessibility Web Developer with deep experience in remediating accessibility issues across a wide range of platforms and technologies.
You will partner closely with our accessibility testers and analysts to turn accessibility audit findings into fully remediated digital experiences that meet or exceed compliance standards.
The ideal candidate will have expert-level experience remediating accessibility barriers in CMS systems such as Sitecore, Salesforce, and custom web applications (HTML/ARIA/CSS/JavaScript), as well as working knowledge of AWS services, Biznet platforms, and enterprise databases.
You will be hands-on in HTML and accessibility markup remediation, working primarily within the State's CMS platforms and custom HTML environments.
You'll partner with digital accessibility testers to review audit findings and make front end code corrections to ensure WCAG 2.1 AA compliance.
Remediation Focus Areas
- Apply accessibility fixes to front-end code and markup issues identified through audits (e.g., color corrections, alt text, heading structure, keyboard navigation, link roles, ARIA roles)
- Modify and restructure HTML, CSS, and ARIA to comply with WCAG 2.1 AA standards
- Work within CMS platforms like Sitecore, Salesforce, and WordPress to correct issues in templates, content types, and presentation layers
- Support content and design teams with accessibility guidance for remediating documents, forms, and embedded media
- Use defect tracking tools (JIRA) to manage tickets and document fixes
- Collaborate with accessibility testers and content strategists to validate remediated work and prevent recurrence of issues
- Share knowledge and remediation patterns with other developers to promote consistency and sustainability
Required Knowledge, Skills, and Abilities
- Bachelor's degree in Computer Science, Software Engineering, IT, or a related field
- 4 years of experience remediating digital accessibility issues in websites, apps, and platforms
- Strong coding experience in HTML, CSS, JavaScript, and ARIA markup
- Working knowledge of Sitecore and Salesforce platforms, with demonstrated remediation success
- Familiarity with Biznet applications, AWS infrastructure, or common enterprise back-end platforms
- Ability to interpret automated and manual testing results (e.g., Axe, ANDI, NVDA, JAWS) and apply solutions
- Expert knowledge of WCAG 2.1 AA standards and assistive technology interactions
- Proficiency in CMS templates, JavaScript frameworks, backend API configuration, and UI component libraries
- Experience troubleshooting keyboard traps, focus management, form label/field logic, and responsive layouts
- Strong ability to work in agile sprints, manage remediation tickets, and track progress in Jira or similar tools
- Ability to collaborate with QA testers, content editors, and project managers in an agile environment
- Excellent communication and documentation skills for communicating fixes and coaching teams
Preferred Skills and Qualifications
- Experience with Sitecore MVC or SXA customization
- Front-end developer or CMS certifications
- Experience with accessibility remediation tools
- Experience with customized CMS themes, templates, and components
- Strong attention to content structure (heading levels, alt text, semantic HTML)
- Experience remediating PDF, Word, or PowerPoint documents (for secondary support)
- Familiarity with CI/CD integration of accessibility checks (e.g., axe-core in pipelines)
- Familiarity with design handoff tools (e.g., Figma or Adobe XD) for accessibility review
Desired Certifications
One or more of the following:
- IAAP WAS (Web Accessibility Specialist) - strongly preferred
- IAAP CPACC
- DHS Trusted Tester Certification
- Deque University Developer Track Certificate
- Salesforce Accessibility Champion or similar
- Prior PowerCenter → IDMC migration
- Experience or familiarity with Linux system administration activities
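Many of the audit findings this role remediates, such as images missing alt text, can also be detected automatically. The toy scanner below shows the idea using only the Python standard library; real audits rely on tools like axe-core, ANDI, or screen readers, and this sketch covers just one WCAG 2.1 check.

```python
# Toy illustration of one automatable WCAG 2.1 check: flag <img> tags that
# lack an alt attribute. Not a substitute for a full accessibility audit.
from html.parser import HTMLParser

class MissingAltScanner(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []  # src values of <img> tags with no alt attribute

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "img" and "alt" not in attr_map:
            self.missing.append(attr_map.get("src", "<no src>"))

scanner = MissingAltScanner()
scanner.feed('<p><img src="rug.jpg" alt="Wool rug"><img src="logo.png"></p>')
print(scanner.missing)  # → ['logo.png']
```

Note that an empty alt="" is valid for decorative images under WCAG, so a scanner like this flags only the truly missing attribute, which is exactly the kind of judgment the remediation work requires.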
Position Summary
Perform a variety of routine and complex skilled and technical work in the maintenance of a Geographic Information System (GIS) relating to the Public Works Computerized Maintenance Management System (CMMS) and asset management program. Act as the primary contact for Public Works CMMS data stewardship. Apply GIS technology to provide GIS and CMMS data related technical support. Perform research, analysis, design and creation of data and applications for use in the Geographic Information System. These duties are illustrative only; the position may include other related work.
Full-time 40 hours per week
AFSCME-represented position
12-month probationary period
Must meet all qualifications and requirements as listed in the position description.
Essential Duties
Collects, inputs, edits, and verifies spatial data from a variety of internal and external data inputs. Integrates associated attribute data. Manipulates, models, and analyzes spatial data in the geographic information system. Documents data entry and related procedures.
Maintains Public Works GIS datasets and mapping system. Applies GIS technology to produce and perform advanced data entry and manipulation, produces documentation, and performs spatial analysis. Develops and runs spatial queries and produces reports.
Modifies and maintains CMMS data to support asset data analysis. Collaborates with Asset Management staff and Public Works supervisors to assist in program development by gathering asset information and other new and historical data needed for the asset data system to function effectively; creates new codes for these areas and, when necessary, modifies asset characteristics and descriptions.
Coordinates with Public Works program supervisors to enter data into the system efficiently and accurately. Collects and enters asset data into the CMMS and related databases from various sources, including direct field investigation and documents such as as-built drawings, invoices, and O&M manuals.
Generates standard and ad-hoc reports using the asset data system's standard report structure and other end-user reporting tools; provides information for the preparation and distribution of periodic standard location and equipment reports to support maintenance teams and management requests.
Performs quality control checks of asset data to ensure the accuracy of all data within the system.
Provides implementation and ongoing operational support for GIS/CMMS and GIS/CMMS users.
Provides system and data troubleshooting. Collaborates with IT to resolve system or data issues.
Develops programs, procedures, and applications using GIS and related software tools.
Applies software such as CAD, database, spreadsheet, word processing, communications, graphics, and web publishing software to the production and delivery of GIS-related products.
Provides daily user support including routine troubleshooting and system and data maintenance for asset data analysis, working closely with Information Technology to evaluate responsibility for addressing specific requests.
Provides technical assistance and guidance to users of GIS products. Performs departmental-focused project management. Meets with GIS users to define project requirements and set priorities.
Participates on interdepartmental teams and committees for GIS and CMMS projects. Contributes to work group GIS software design projects. Maintains an understanding of the ESRI product portfolio and provides guidance for Public Works' use of available tools.
Operates printers, copiers and large-format plotters, and has ability to load large rolls of paper into plotters.
Acts ethically and honestly; applies ethical standards of behavior to daily work activities and interactions. Builds confidence in the City through own actions.
Conforms with all safety rules and performs work in a safe manner.
Delivers excellent customer service to diverse audiences. Maintains positive customer service demeanor and delivers service in a respectful and patient manner.
Maintains effective work relationships.
Arrives to work, meetings, and other work-related functions on time and maintains regular job attendance.
Complies with all Administrative Policies. Performs work in accordance with Council Policies and Municipal Code sections applicable to the position.
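The spatial queries described in the duties above typically reduce to selecting the assets whose geometry intersects an area of interest. The toy sketch below shows that selection logic with a simple bounding box; production work would use ESRI tools or a spatial database, and the asset IDs and coordinates here are made up.

```python
# Toy illustration of a spatial query: select assets whose point location
# falls inside a maintenance zone (a bounding box). All data is hypothetical;
# real GIS work would use ESRI tooling or spatial SQL, not hand-rolled math.
def assets_in_zone(assets, xmin, ymin, xmax, ymax):
    """Return IDs of assets whose (x, y) point lies inside the bounding box."""
    return [
        asset_id
        for asset_id, (x, y) in assets.items()
        if xmin <= x <= xmax and ymin <= y <= ymax
    ]

hydrants = {"H-101": (2.0, 3.5), "H-102": (9.1, 0.4), "H-103": (4.7, 4.9)}
print(sorted(assets_in_zone(hydrants, 0, 0, 5, 5)))  # → ['H-101', 'H-103']
```

Real zones are polygons rather than boxes, but the workflow is the same: run the spatial predicate, then join the matching asset IDs back to CMMS attribute data for reporting.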
Qualifications and Skills
Education and Experience
High School diploma or equivalent. Four years of professional experience in designing, supporting and implementing GIS applications. A post-secondary degree in GIS or closely related field may substitute for up to 4 years of experience.
Strong computer background in GIS software, computer-aided drafting (CAD) software, related third-party GIS software applications, database management systems software, and Windows-based operating systems.
Municipal experience is desired.
Knowledge, Skills and Abilities
General knowledge of the principles, theories and methods of database concepts, structures, and programming logic; and the various types, classes, uses, and interrelationships of assets within a typical municipal Public Works department.
Advanced skills in use of GIS and CMMS related software in a production environment.
Ability to program in GIS, relational and spatial database, and web languages is desired.
Good oral and written communication skills; ability to communicate technical information to a non-technical audience, ability to research, interpret and summarize data.
Ability to prioritize multiple projects from numerous customers.
Knowledge of cartographic principles, spatial analysis techniques, and data management practices.
Ability to research and recommend new methods, equipment, or programs to better accomplish tasks.
Ability to travel among City worksites.
Special Requirements
Ability to pass a pre-employment background and/or criminal history check
Demonstrable commitment to sustainability.
Demonstrable commitment to promoting and enhancing equity, diversity and inclusion.
The individual shall not pose a direct threat to the health or safety of the individual or others in the workplace.
How to Apply
Qualified applicants must submit an online application located on the City of Corvallis website (click on "Apply" above).
Resumes will not be accepted in lieu of a completed online application. Incomplete applications will not be accepted/considered.
Position is open until filled.
First review of applications will occur after 8:00 am on February 4, 2026
*Please do not include personal or protected information in attached resumes or cover letters, this includes your birth date, age, dates of education, and graduation dates.*