Data Recognition Corporation Jobs in USA
10,808 positions found
Job Summary:
Our client is seeking a Data Steward to join their team! This position is located Hybrid in Creve Coeur, Missouri.
Duties:
- Understand business capability needs and processes as they relate to IT solutions through partnering with Product Managers and business and functional IT stakeholders
- Participate in data scraping, data curation and data compilation efforts
- Ensure high-quality data is delivered to end users
- Ensure high quality of the in-house data via data stewardship
- Implement and utilize data solutions for data analysis and profiling using a variety of tools such as SQL, Postman, R, or Python and following the team’s established processes and methodologies
- Collaborate with other data stewards and engineers within the team and across teams on aligning delivery dates and integration efforts
- Define data quality rules and implement automated monitoring, reporting, and remediation solutions
- Coordinate intake and resolution of data support tickets
- Support data migration from legacy systems, as well as data inserts and updates not supported by applications
- Partner with the Data Governance organization to ensure data is secured and access is being managed appropriately
- Identify gaps within existing processes and create new documentation templates to improve existing processes and procedures
- Create mapping documents and templates to improve existing manual processes
- Perform data discoveries to understand data formats, source systems, etc. and engage with business partners in this discovery process
- Help answer questions from the end-users and coordinate with technical resources as needed
- Build prototype SQL queries and continuously engage with end consumers on enhancements
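The profiling and analysis duties above are often sketched in a few lines of Python before being formalized in tooling. The snippet below is a minimal, hypothetical illustration; the `material_id`/`plant` fields and sample rows are invented, not from any actual client system:

```python
from collections import Counter

def profile_column(rows, column):
    """Return basic profiling stats (count, nulls, distinct, most common) for one column."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v is not None]
    return {
        "count": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
        "top": Counter(non_null).most_common(1),
    }

rows = [
    {"material_id": "M-1", "plant": "STL"},
    {"material_id": "M-2", "plant": None},
    {"material_id": "M-2", "plant": "STL"},
]
print(profile_column(rows, "plant"))
```

In practice the same shape of check would run as SQL against the source system; the Python form is handy for ad hoc discovery work.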
Desired Skills/Experience:
- Bachelor's Degree in Computer Science, Engineering, Science, or other related field
- Applied experience with modern engineering technologies and data principles (e.g., Big Data, cloud compute, NoSQL)
- Applied experience querying SQL and/or NoSQL databases
- Experience in designing data catalogs, including data design, metadata structures, object relations, catalog population, etc.
- Data Warehousing experience
- Strong written and verbal communication skills
- Comfortable balancing demands across multiple projects / initiatives
- Ability to identify gaps in requirements based on business subject matter domain expertise
- Ability to deliver detailed technical documentation
- Expert level experience in relevant business domain
- Experience managing data within SAP
- Experience managing data using APIs
- BigQuery experience
Benefits:
- Medical, Dental, & Vision Insurance Plans
- Employee-Owned Profit Sharing (ESOP)
- 401K offered
The approximate pay range for this position starts at $104,000 - $115,000+. Please note that the pay range provided is a good faith estimate. Final compensation may vary based on factors including but not limited to background, knowledge, skills, and location. We comply with local wage minimums.
At KellyMitchell, our culture is world class. We’re movers and shakers! We don’t mind a bit of friendly competition, and we reward hard work with unlimited potential for growth. This is an exciting opportunity to join a company known for innovative solutions and unsurpassed customer service. We're passionate about helping companies solve their biggest IT staffing & project solutions challenges. As an employee-owned, women-led organization serving Fortune 500 companies nationwide, we deliver expert service at a moment's notice.
By applying for this job, you agree to receive calls, AI-generated calls, text messages, or emails from KellyMitchell and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy at
About Wakefern
Wakefern Food Corp. is the largest retailer-owned cooperative in the United States and supports its co-operative members' retail operations, trading under the ShopRite®, Price Rite®, The Fresh Grocer®, Dearborn Markets®, and Gourmet Garage® banners.
Employing an innovative approach to wholesale business services, Wakefern focuses on helping the independent retailer compete in a big business world. Providing the tools entrepreneurs need to stay a step ahead of the competition, Wakefern’s co-operative members benefit from the company’s extensive portfolio of services, including innovative technology, private label development, and best-in-class procurement practices.
The ideal candidate will have a strong background in designing, developing, and implementing complex projects, with a focus on automating data processes and driving efficiency within the organization. This role requires close collaboration with application developers, data engineers, data analysts, and data scientists to ensure seamless data integration and automation across various platforms. The Data Integration & AI Engineer is responsible for identifying opportunities to automate repetitive data processes, reduce manual intervention, and improve overall data accessibility.
Essential Functions
- Participate in the development life cycle (requirements definition, project approval, design, development, and implementation) and maintenance of the systems.
- Implement and enforce data quality and governance standards to ensure the accuracy and consistency of data.
- Provide input for project plans and timelines to align with business objectives.
- Monitor project progress, identify risks, and implement mitigation strategies.
- Work with cross-functional teams and ensure effective communication and collaboration.
- Provide regular updates to the management team.
- Follow the standards and procedures according to Architecture Review Board best practices, revising standards and procedures as requirements change and technological advancements are incorporated into the technology structure.
- Communicate and promote the code of ethics and business conduct.
- Ensure completion of required company compliance training programs.
- Be trained, either through formal education or through experience, in software/hardware technologies and development methodologies.
- Stay current through personal development and professional and industry organizations.
Responsibilities
- Design, build, and maintain automated data pipelines and ETL processes to ensure scalability, efficiency, and reliability across data operations.
- Develop and implement robust data integration solutions to streamline data flow between diverse systems and databases.
- Continuously optimize data workflows and automation processes to enhance performance, scalability, and maintainability.
- Design and develop end-to-end data solutions utilizing modern technologies, including scripting languages, databases, APIs, and cloud platforms.
- Ensure data solutions and data sources meet quality, security, and compliance standards.
- Monitor and troubleshoot automation workflows, proactively identifying and resolving issues to minimize downtime.
- Provide technical training, documentation, and ongoing support to end users of data automation systems.
- Prepare and maintain comprehensive technical documentation, including solution designs, specifications, and operational procedures.
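The pipeline responsibilities above follow a common extract-transform-load shape. A minimal sketch, with invented field names and an in-memory list standing in for a real target database; a production version would read from and write to actual systems:

```python
def extract(source_rows):
    # In practice this would read from a database or API; here, an in-memory list.
    return source_rows

def transform(rows):
    # Normalize SKUs and drop rows missing a required key (a simple quality gate).
    return [
        {"sku": r["sku"].strip().upper(), "qty": int(r["qty"])}
        for r in rows
        if r.get("sku") and r.get("qty") is not None
    ]

def load(rows, target):
    # Append transformed rows to the target; return how many were loaded.
    target.extend(rows)
    return len(rows)

target = []
loaded = load(transform(extract([
    {"sku": " ab-1 ", "qty": "3"},
    {"sku": None, "qty": "2"},  # rejected by the quality gate: missing SKU
])), target)
print(loaded, target)
```

Orchestration tools like Airflow or Cloud Composer, mentioned in the qualifications below, schedule and monitor exactly this kind of step sequence.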
Qualifications
- A bachelor's degree or higher in computer science, information systems, or a related field.
- Hands-on experience with cloud data platforms (e.g., GCP, Azure, etc.)
- Strong knowledge and skills in data automation technologies, such as Python, SQL, ETL/ELT tools, Kafka, APIs, cloud data pipelines, etc.
- Experience in GCP BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
- Experience with workflow orchestration tools such as Cloud Composer or Airflow
- Proficiency in iPaaS (Integration Platform as a Service) platforms, such as Boomi, SAP BTP, etc.
- Experience developing and managing data integrations for AI agents, connecting them to internal and external APIs, databases, and knowledge sources to expand their capabilities.
- Experience building and maintaining scalable Retrieval-Augmented Generation (RAG) pipelines, including the curation and indexing of knowledge bases in vector databases (e.g., Pinecone, Vertex AI Vector Search).
- Experience leveraging cloud-based AI/ML platforms (e.g., Vertex AI, Azure ML) to build, train, and deploy machine learning models at scale.
- Experience establishing and enforcing data quality and governance standards for AI/ML datasets, ensuring the accuracy, completeness, and integrity of data used for model training and validation.
- Experience collaborating closely with data scientists and machine learning engineers to understand data requirements and deliver optimized data solutions that support the entire machine learning lifecycle.
- Hands-on experience with IBM DataStage and Alteryx is a plus.
- Strong understanding of database design principles, including normalization, indexing, partitioning, and query optimization.
- Ability to design and maintain efficient, scalable, and well-structured database schemas to support both analytical and transactional workloads.
- Familiarity with BI visualization tools such as MicroStrategy, Power BI, Looker, or similar.
- Familiarity with data modeling tools.
- Familiarity with DevOps practices for data (CI/CD pipelines)
- Proficiency in project management software (e.g., JIRA, Clarizen, etc.)
- Strong knowledge and skills in data management, data quality, and data governance.
- Strong communication, collaboration, and problem-solving skills.
- Ability to work on multiple projects and prioritize tasks effectively.
- Ability to work independently and in a team environment.
- Ability to learn new technologies and tools quickly.
- Ability to handle stressful situations.
- Highly developed business acumen.
- Strong critical thinking and decision-making skills.
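For the RAG pipeline qualification above, the core retrieval step reduces to ranking indexed documents by vector similarity to a query. A toy sketch in plain Python; in practice the vectors come from an embedding model and live in a vector database such as Pinecone or Vertex AI Vector Search, and the two documents and three-dimensional vectors here are invented for illustration:

```python
import math

def cosine(a, b):
    # Cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy "index": document text paired with a precomputed embedding vector.
index = [
    ("returns policy", [1.0, 0.0, 0.2]),
    ("store hours", [0.0, 1.0, 0.1]),
]

def retrieve(query_vec, k=1):
    # Rank all indexed documents by similarity and return the top k texts.
    ranked = sorted(index, key=lambda d: cosine(query_vec, d[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

print(retrieve([0.9, 0.1, 0.2]))
```

The retrieved text is then supplied to the language model as context; the "curation and indexing" work in the qualification is about keeping that index accurate and current.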
Working Conditions & Physical Demands
This position requires in-person office presence at least 4x a week.
Compensation and Benefits
The salary range for this position is $75,868 - $150,644. Placement in the range depends on several factors, including experience, skills, education, geography, and budget considerations.
Wakefern is proud to offer a comprehensive benefits package designed to support the health, well-being, and professional development of our Associates. Benefits include medical, dental, and vision coverage, life and disability insurance, a 401(k) retirement plan with company match & annual company contribution, paid time off, holidays, and parental leave.
Associates also enjoy access to wellness and family support programs, fitness reimbursement, educational and training opportunities through our corporate university, and a collaborative, team-oriented work environment. Many of these benefits are fully or partially funded by the company, with some subject to eligibility requirements.
Who We Are
At Feetures, movement is our business. And we believe that a meaningful business begins with authentic values—and our values were forged by the bonds of family.
What started as a bold idea around a kitchen table has grown into a fast-moving, purpose-driven brand redefining performance. As a family-owned company in North Carolina, we’re fueled by the belief that better is always possible—and that energy drives both our products and our culture.
Movement is at the heart of everything we do. From our socks to our team and to our communities, we are always pushing forward. If you are ready to grow, challenge the status quo, and help shape the next chapter of a brand that is always in stride, come move with us. Feetures is Meant to Move. Are you?
Role Summary:
The Data Analytics Manager is responsible for owning and optimizing the organization’s end-to-end data ecosystem, ensuring that data infrastructure, governance, and analytics processes effectively support business operations. This role leads the design and management of the data stack—from source system integrations and NetSuite Analytics Warehouse to reporting and business intelligence tools—while establishing strong data governance standards, quality monitoring, and documentation practices. The manager also oversees and mentors analytics team members, prioritizes analytics requests, and coordinates cross-functional data workflows. Acting as the central authority for data reliability and insights, the role ensures consistent metric definitions, scalable data models, and accurate reporting while translating complex data into clear, actionable insights for business stakeholders.
Responsibilities:
Data Architecture & Tooling
- Own the end-to-end data stack — from source system integrations and the NetSuite Analytics Warehouse to downstream reporting layers
- Evaluate, select, and implement tools that improve data accessibility, reliability, and performance
- Ensure alignment between data infrastructure and evolving business needs across distribution operations
- Design and maintain scalable data models, SuiteQL queries, and saved searches within NetSuite
Data Governance & Quality
- Define and enforce data standards, metric definitions, and naming conventions across all business domains
- Establish data ownership, lineage documentation, and access governance policies
- Implement monitoring and alerting for data quality issues across source systems and the warehouse
- Build and maintain a data dictionary that serves as the single source of truth for the organization
Orchestration of Analysts & Systems
- Manage and mentor the Data Analyst and Business Analyst — prioritizing requests, unblocking work, and validating outputs
- Triage and prioritize the analytics request queue in alignment with business stakeholders and IT leadership
- Coordinate cross-functional data workflows and ensure handoffs between systems and analysts are clean and documented
- Serve as the escalation point for data discrepancies, report failures, and analytical questions from the business
Qualifications:
Required
- 3-5 years of experience in data analytics, business intelligence, or data engineering
- 2+ years in a lead or management role overseeing analysts or data team members
- Strong proficiency in SQL; experience with SuiteQL or similar ERP query languages
- Hands-on experience with NetSuite, including Analytics Warehouse, saved searches, and reporting
- Proven track record establishing data governance standards and documentation practices
- Experience integrating and managing multiple data sources across SaaS and ERP platforms
- Demonstrated ability to translate complex data into clear, actionable insights for non-technical stakeholders
Preferred
- Experience in distribution, wholesale, or supply chain environments
- Familiarity with SaaS BI platforms (e.g., Tableau, Power BI, Looker, or embedded analytics)
- Exposure to scripting or automation (JavaScript, Python, or similar) for data workflows
- Background working within IT-led or hybrid IT/Analytics teams
Benefits:
- Health insurance
- Dental insurance
- Vision insurance
- Life & Disability insurance
- 401(K) with company match
Company Paid holidays and PTO:
- Feetures offers 20 PTO Days which are available to you on day one of employment and are available to all employees, no matter your role. After working at Feetures for 5 years, your PTO days will increase to 25 days. Days can be used for vacations, appointments and sick days.
- We offer 10 company paid holidays and 1 floating holiday per year.
Perks:
- Parking provided (Charlotte office and onsite at Hickory office)
- Employee Engagement team
- Monthly stipend to pursue an active lifestyle
Feetures is an Equal Opportunity Employer that welcomes and encourages all applicants to apply regardless of age, race, sex, religion, color, national origin, disability, veteran status, sexual orientation, gender identity and/or expression, marital or parental status, ancestry, citizenship status, pregnancy or other reasons protected by law.
Job Description
The Data Quality Analyst / Databricks Implementation Specialist plays a key role in advancing the company’s enterprise data governance and Databricks Lakehouse strategy. This role partners closely with business data stewards, data owners, and technical teams to translate business data requirements into governed, high-quality datasets within Databricks Unity Catalog. The analyst will support domain onboarding, develop and operationalize data quality rules, perform profiling and analysis, and help implement enterprise standards for metadata, lineage, and semantic consistency.
Key Responsibilities
- Data Quality & Profiling
- Develop, document, and maintain data quality rules for critical data elements (CDEs).
- Perform data profiling, anomaly detection, and root-cause analysis.
- Partner with data stewards to validate definitions, thresholds, and business rules.
- Monitor and report on data quality metrics and remediation progress.
- Databricks Unity Catalog Implementation
- Support Unity Catalog rollout across domains, including catalog structure, tagging, and metadata standards.
- Assist with onboarding domains into the Bronze → Silver → Gold architecture.
- Ensure lineage, ownership, and quality rules are embedded into Databricks pipelines.
- Help implement domain-aligned access controls and sensitivity tagging.
- Collaboration with Data Stewards & Business Partners
- Work directly with business data stewards to understand data requirements and quality expectations.
- Translate business meaning into standardized CDEs and steward-approved metadata.
- Facilitate working sessions to align on semantics, domain boundaries, and data product requirements.
- Support consistent governance practices across domains.
- Metadata, Lineage, and Catalog Management
- Maintain high-quality metadata in the enterprise data catalog.
- Ensure CDEs, KPIs, and domain terms are accurately documented.
- Validate lineage from raw sources through refined layers.
- Data Analysis & Issue Resolution
- Investigate data issues raised by business users or downstream consumers.
- Perform impact analysis for schema changes or quality rule updates.
- Support remediation efforts with engineering and business teams.
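Data quality rules for critical data elements, as described above, are often expressed as named predicates evaluated over records, with pass/fail counts feeding the metrics reporting. A minimal, hypothetical sketch; the rule names, fields, and sample records are invented:

```python
# Each rule: (name, predicate). A record fails a rule when the predicate is False.
rules = [
    ("customer_id_not_null", lambda r: r.get("customer_id") is not None),
    ("email_has_at_sign", lambda r: "@" in (r.get("email") or "")),
]

def evaluate(records):
    # Produce a per-rule report of how many records were checked and how many failed.
    report = {}
    for name, predicate in rules:
        failures = [r for r in records if not predicate(r)]
        report[name] = {"checked": len(records), "failed": len(failures)}
    return report

records = [
    {"customer_id": 1, "email": "a@example.com"},
    {"customer_id": None, "email": "bad-email"},
]
print(evaluate(records))
```

In a Databricks setting the same rules would typically run inside pipeline expectations or scheduled quality jobs rather than ad hoc Python, but the rule-as-predicate shape carries over.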
Required Skills & Experience
3–5 years of experience in data quality, data governance, or data analysis.
Hands-on experience with Databricks, Delta Lake, or similar cloud platforms.
Strong understanding of data quality concepts.
Experience with metadata catalogs or governance tools.
Proficiency with SQL and data analysis.
Strong communication skills.
Nice to Have Skills & Experience
Experience with Databricks Unity Catalog.
Familiarity with Medallion Architecture.
Exposure to governance frameworks (DAMA, DCAM).
Experience collaborating with data stewards or data owners.
Knowledge of data modeling or semantic layers.
Pay rate: $35-$43/hr, depending on background and experience
Job Name: MDM Data Quality & Cleansing Specialist
Job Location: Wayne, PA, 19087 (2 days/week onsite is required - Team onsite day is Thursdays)
Duration: 6 Months with potential to extend
Working Hours: 8:30 am - 5:30 pm (some flexibility)
Interview Process: One 45-minute virtual interview
Position Summary
The MDM Data Quality & Cleansing Specialist is responsible for supporting enterprise Master Data Management (MDM) initiatives by performing remediation of post-match-merge fallout records and executing data cleansing activities across designated data domains. This position plays a critical role in ensuring the accuracy, consistency, and completeness of master data in accordance with established data governance policies, data quality standards, and operational procedures.
Responsibilities
- MDM Fallout Management
- Review and research fallout records generated from MDM match merge processes.
- Perform timely and accurate remediation of data exceptions in accordance with predefined business rules and governance standards.
- Validate survivorship outcomes and ensure that entity resolution results align with data stewardship expectations.
- Conduct root cause analysis to determine factors contributing to recurring data exceptions.
- Data Cleansing and Data Quality Support
- Execute data cleansing tasks including standardization, deduplication, formatting corrections, and attribute validation.
- Verify data completeness and accuracy using approved tools, templates, and quality checks.
- Perform bulk updates or corrections as authorized, following established protocols and change control requirements.
- Assist in monitoring data quality dashboards, reports, and exception queues.
- Data Stewardship Collaboration
- Collaborate with Data Governance, Data Stewards, business partners, and MDM Operations teams to resolve data issues requiring business input.
- Document remediation decisions and maintain required audit trails in accordance with compliance and governance standards.
- Support stewardship processes by escalating complex or policy related issues as appropriate.
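The cleansing tasks above, standardization followed by deduplication, can be sketched as a small two-step pipeline. This is an illustrative toy rather than the client's actual process; a naive "first record wins" survivorship rule stands in for real match-merge logic, and the fields and sample records are invented:

```python
def standardize(record):
    # Trim whitespace and upper-case the name; normalize empty strings to None.
    name = (record.get("name") or "").strip().upper()
    return {"id": record["id"], "name": name or None}

def deduplicate(records):
    # Keep the first record seen for each standardized name (naive survivorship).
    seen, out = set(), []
    for r in map(standardize, records):
        key = r["name"]
        if key not in seen:
            seen.add(key)
            out.append(r)
    return out

print(deduplicate([
    {"id": 1, "name": " Acme Corp "},
    {"id": 2, "name": "acme corp"},  # duplicate after standardization
    {"id": 3, "name": "Globex"},
]))
```

Real MDM tools apply configurable match rules and survivorship policies instead of a hard-coded key, but the standardize-then-resolve ordering is the same.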
Qualifications
Required
- Minimum of 2 years of experience in Master Data Management, Data Governance, Data Quality, or a related data operations role.
- Proficiency with Microsoft Excel (e.g., lookup functions, pivot tables, filtering, data cleaning techniques).
- Experience working with one or more MDM applications (e.g., Informatica or similar).
Preferred
- Experience with match merge or entity resolution workflows.
- Basic proficiency in SQL or other data manipulation/query tools.
- Familiarity with data governance frameworks, data quality rules, and metadata management principles.
- Prior experience working with party (customer, partner) master data.
As a Data Steward Senior Analyst, you are part of a team responsible for enabling and supporting compliance with data-related enterprise policies within their domains/business units. You and your team are responsible for identifying critical data and associated risks, maintaining data definitions, classifying data, supporting data sourcing / usage requests, measuring Data Risk Controls, and confirming Data Issues are remediated. You have the opportunity to partner across various business units, technology teams, and product/platform teams to define and implement the data governance strategy, supervising and leading data quality, resolving data/platform issues, and driving consistency, usability, and governance of specific product data across the enterprise.
In addition, this role will play a key part in effectively communicating new and updated data-related policies to the teams responsible for compliance. The individual must be skilled in preparing clear, engaging presentations that translate formal policy language into practical, easy-to-understand guidance and “tell the story” behind the policy requirements. The role will also support the delivery of training sessions, facilitate policy office hours, and serve as a go-to resource for questions related to data governance and retention compliance.
Your Primary Responsibilities may include:
• Assist in identifying data-related risks and associated controls for key business processes. Risks relate to Record Retention (primary), Data Quality, Data Movement, Data Stewardship, Data Protection, Data Sharing, among others.
• Develop training materials and educate organization on Record Retention and Deletion processes and procedures.
• Develop deep understanding of key enterprise data-related policies and serve as the policy expert for the business unit, providing education to teams regarding policy implications for business.
• Collaborate with and influence product managers to ensure all new use cases are managed according to policies.
• Influence and contribute to strategic improvements to data assessment processes and analytical tools.
• Support current regulatory reporting needs via existing platforms, working with upstream data providers, downstream business partners, as well as technology teams.
• Provide subject matter expertise on multiple platforms.
• Partner with the Data Steward Manager in developing and managing the data compliance roadmap.
Qualifications include:
• 5+ years of experience in a similar role ensuring compliance with Record Retention and Deletion policies.
• Strong communication skills and ability to influence and engage at multiple levels and cross functionally.
• Intermediate understanding of Data Management and Data Governance concepts (metadata, lineage, data quality, etc.), with prior hands-on experience.
• 5+ years of Data Quality Management experience.
• Strong familiarity with data architecture and/or data modeling concepts
• 5+ years of experience with Agile or SAFe project methodologies
• Bachelor’s degree in Finance, Engineering, Mathematics, Statistics, Computer Science or other similar fields.
• Preferred: Experience in Travel Industry.
• Preferred: Knowledge of RCSA (Risk Control Self-Assessment) methodology
Leadership Skills may include:
• Makes Decisions Quickly and Effectively: Drives effective outcomes through decision-making authority. Displays judgment and discretion to ensure deliverables meet American Express policy and overall compliance requirements.
• Drives Innovation & Change: Provides systematic and rational analysis to identify the root cause of problems. Is prepared to challenge the status quo and drive innovation. Makes informed judgments, recommends tailored solutions.
• Leverages Team - Collaboration: Coordinates efforts within and across teams to deliver goals, accountable to bring in ideas, information, suggestions, and expertise from others outside & inside the immediate team.
• Communication: Influences and holds others accountable and has ability to convince others. Identifies the specific data governance requirements and is able to communicate clearly and in a compelling way.
Loloi Rugs is a leading textile brand that designs and crafts rugs, pillows, and throws for the thoughtfully layered home. Family-owned and led since 2004, Loloi is growing more quickly than ever. To date, we’ve expanded our diverse team to hundreds of employees, invested in multiple distribution facilities, introduced thousands of products, and earned the respect and business of retailers and designers worldwide. A testament to our products and our team, Loloi has earned the ARTS Award for “Best Rug Manufacturer” in 2010, 2011, 2015, 2016, 2018, 2023, and 2025.
Security Advisory: Beware of Frauds
Protect yourself from potential fraud and verify the authenticity of any job offer you receive from Loloi. Rest assured that we never request payment or demand any sensitive personal information, such as bank details or social security numbers, at any stage of the recruiting process. To ensure genuine communication, our recruiters will solely reach out to applicants using an @ email address. Your security is of paramount importance to us at Loloi, and we are committed to maintaining a safe and trustworthy hiring experience for all candidates.
We are building a Business Operations Center of Excellence, and we need a Product Data Analyst to serve as the "Guardian of the Golden Record." In this role, you are the absolute owner of product data integrity as it relates to the digital customer experience. You ensure that every item we sell is accurately represented across every touchpoint—from our ERP and PIM to our website storefront and marketing feeds. This is not a data entry role; it is a high-impact technical logic and investigation role. You will work directly with our Data Platform and Software Engineering teams to define business rules, audit data health via complex SQL, and troubleshoot data transmission errors before they impact the customer.
Responsibilities
- Storefront Governance: Serve as the absolute owner of product data integrity within the PIM. Ensure that all storefront-critical attributes (pricing, dimensions, weights, image links) are accurate and standardized for a seamless customer experience.
- Technical Data Auditing: Write and run complex SQL queries against our centralized database to identify anomalies, "orphan" records, and data hygiene issues that need resolution. You will be expected to query across multiple schemas to validate data consistency between systems.
- Feed Logic & Mapping: You will manage the logic of how data translates from our PIM to external endpoints. You will ensure that our products appear correctly on Google Shopping, Meta, Amazon, and other marketplaces by managing feed rules and mapping definitions.
- API Payload Analysis: You will act as the first line of defense for data transmission errors. If a product isn't showing up on the site, you will review the JSON/XML response bodies to determine if it is a data payload error or a software code bug.
- Cross-Functional Impact Analysis: You will act as the gatekeeper for data changes, predicting downstream impacts (e.g., "If Merchandising changes this Category Name, it will break the Finance reporting filter").
- Hygiene Logic Definition: You will partner with our IT/Database team to define automated health checks. You identify the "rot" (bad data patterns), and they implement the database constraints to stop it.
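The payload-analysis duty above amounts to checking whether a product record is well-formed JSON and carries all storefront-critical keys before escalating to Engineering as a code bug. A hedged sketch; the required-attribute set and sample payload are hypothetical:

```python
import json

REQUIRED = {"sku", "price", "weight"}  # hypothetical storefront-critical attributes

def payload_errors(raw):
    """Return a list of problems found in one product payload (empty = data looks valid)."""
    try:
        payload = json.loads(raw)
    except json.JSONDecodeError as e:
        return [f"malformed JSON: {e.msg}"]
    missing = sorted(REQUIRED - payload.keys())
    return [f"missing key: {k}" for k in missing]

print(payload_errors('{"sku": "RUG-100", "price": 129.99}'))
```

If this check comes back clean, the handoff rule in the next section applies: the data is correct, so the failure is a code or caching bug on the engineering side.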
What You Will NOT Do (The Boundaries)
- No Web Development: You are not a Front-End Developer. You do not write HTML, CSS, or React code. You ensure the data powering those components is 100% accurate.
- No Manual Data Entry: Your job is not to copy-paste descriptions. You build the systems, bulk processes, and logic that ensure data quality at scale.
- No Database Administration: You do not manage server uptime or schema changes (IT owns this). You own the quality of the records inside the database.
Intersection with Technical Teams
- With IT (Database Mgmt): IT owns the infrastructure and schema; you own the quality of the data within it. When you identify a systemic issue (e.g., "5,000 orphan records"), you partner with IT to implement the technical fix (scripts/constraints).
- With Software Engineering (Commerce): If a product is missing from the site, you check the data payload. If the data is correct, you hand off to Engineering, confirming it is a code/caching bug rather than a data error.
Experience, Skills, & Ability Requirements
- 5-8 years of experience in Data Management, PIM Administration, or technical eCommerce Operations.
- SQL Proficiency: You are comfortable writing queries beyond simple SELECT *. You should be proficient with CTEs (Common Table Expressions), Window Functions (e.g., Rank, Lead/Lag), Subqueries, and complex Joins to act as a forensic data investigator.
- API Fluency: You can read and understand JSON and XML. You know what a valid payload looks like and can spot formatting errors or missing keys.
- Data Manipulation: You are an expert at handling large datasets (CSVs, Excel) and understand data types, formatting standards, and normalization concepts.
- You love hunting down the root cause of an error. You don't just fix the wrong price; you find out why the price was wrong and build a rule to stop it from happening again.
- You have high standards for accuracy. You understand that a wrong weight in the system means a financial loss on shipping for the business.
Bonus Points (Nice-to-Haves)
- Familiarity with Visio/Lucidchart to visualize data flows.
- Ability to build simple dashboards in Tableau to track data health scores.
- Basic familiarity with Python or R for data manipulation.
What We Offer
- Health, dental, and vision benefits
- Paid parental leave
- 401(k) with employer match
- A culture of meritocracy that fosters ongoing growth opportunities
- A stable, growing family-owned company that looks after its employees
Loloi Rugs does not discriminate on the basis of race, sex, color, religion, age, national origin, marital status, disability, veteran status, genetic information, sexual orientation, gender identity or any other reason prohibited by law in provision of employment opportunities and benefits. We seek a diverse pool of applicants and consider all qualified candidates regardless of race, ancestry, color, gender identity or expression, sexual orientation, religion, national origin, citizenship, disability, Veteran status, marital status, or any other protected status. If you have a special need or disability that requires accommodation, please let us know.
Title: Senior Data Analyst
Duration: Long term
Location: Dallas , TX
Job Description:
Primary responsibilities of the Senior Data Analyst include supporting and analyzing data anomalies across multiple environments, including but not limited to the Data Warehouse, ODS, and Data Replication/ETL systems, as part of broader Data Management initiatives. The candidate will be in a supporting role and will work closely with the Business, DBA, ETL, and Data Management teams, providing analysis and support for complex data-related initiatives. This individual will also be responsible for assisting in the initial setup and ongoing documentation/configuration of Data Governance and Master Data Management solutions. The candidate must have a passion for data, along with strong SQL, analytical, and communication skills.
Responsibilities
- Investigate and Analyze data anomalies and data issues reported by Business
- Work with ETL, Replication and DBA teams to determine data transformations, data movement and derivations and document accordingly
- Work with support teams to ensure consistent and pro-active support methodologies are adhered to for all aspects of data movements and data transformations
- Assist in break fix and production validation as it relates to data derivations, replication and structures
- Assist in configuration and on-going setup of Data Virtualization and Master Data Management tools
- Assist in keeping documentation up to date as it relates to Data Standardization definitions, Data Dictionary and Data Lineage
- Gather information from various Sources and interpret Patterns and Trends
- Ability to work in a team-oriented, fast-paced agile environment managing multiple priorities
Qualifications
- 4+ years of SQL experience working in OLTP, Data Warehouse and Big Data databases
- 4+ years of experience working with Exadata and SQL Server databases
- 4+ years in a Data Analyst role
- Strong attention to detail
- 2+ years writing medium to complex stored procedures a plus
- Ability to collaborate effectively and work as part of a team
- Extensive background in writing complex queries
- Extensive working knowledge of all aspects of Data Movement and Processing, including ETL, API, OLAP and best practices for data tracking
- Good communication skills
- Self-Motivated
- Works well in a team environment
- Denodo Experience a plus
- Master Data Management a plus
- Big Data Experience a plus (Hadoop, MongoDB)
- Postgres and Cloud Experience a plus
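One recurring task implied by the replication and production-validation responsibilities above is reconciling a source table against its replicated copy. The sketch below is a hedged illustration only: the table names are invented, and SQLite (via Python's built-in sqlite3) stands in for the posting's actual platforms (Exadata, SQL Server). It emulates a full outer join with a UNION of two LEFT JOINs to surface rows that are missing on either side or whose values drifted.

```python
import sqlite3

# Illustrative source and replica tables (names are hypothetical).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src (id INTEGER PRIMARY KEY, amount REAL);
CREATE TABLE rep (id INTEGER PRIMARY KEY, amount REAL);
INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
INSERT INTO rep VALUES (1, 10.0), (2, 99.0);  -- row 3 missing, row 2 drifted
""")

# First branch: source rows missing from the replica or with mismatched
# values. Second branch: replica rows with no source counterpart.
query = """
SELECT s.id AS id, s.amount AS src_amount, r.amount AS rep_amount
FROM src s LEFT JOIN rep r ON r.id = s.id
WHERE r.id IS NULL OR r.amount <> s.amount
UNION
SELECT r.id, s.amount, r.amount
FROM rep r LEFT JOIN src s ON s.id = r.id
WHERE s.id IS NULL
ORDER BY id
"""
mismatches = conn.execute(query).fetchall()
for row in mismatches:
    print(row)
# (2, 20.0, 99.0)  -> value drift
# (3, 30.0, None)  -> missing from replica
```

In practice a query like this would feed the break-fix and documentation loop: each mismatch is either a replication defect to escalate or a known transformation to record in the data lineage documentation.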
Get Hired by taking action.
If you just graduated (or you're about to) and the job search is already feeling confusing, you're not imagining it.
A degree proves you can learn—but employers hire for job readiness: projects that look like real work, current tech stacks, interview confidence, and the ability to contribute on day one.
That's why many new grads send hundreds of applications and still hear nothing back.
It's not because you're "not smart enough." It's because most entry-level pipelines are crowded, and hiring teams filter heavily for candidates who look production-ready.
We are actively considering candidates for entry-level software engineering and data roles, especially Java full stack, Java/Python development, DevOps automation, data analytics, data engineering, data science, and ML/AI—full-time opportunities aligned to client needs.
Our core emphasis remains Java/Full Stack/DevOps and Data/Analytics/Engineering/ML.
SynergisticIT focuses on two high-demand lanes: Java / Full Stack / DevOps and Data (Data Analyst, Data Engineer, Data Scientist) + ML/AI—so you don't graduate with scattered skills, you graduate with an employable stack.
SynergisticIT, since 2010, has helped candidates land full-time roles at major organizations (examples often cited include Google, Apple, PayPal, Visa, Western Union, Wells Fargo, Wayfair, and other clients, including banking organizations) with offers commonly in the $95k–$154k range depending on role and skill depth.
For a new grad, the bigger message isn't the number—it's that results require a structured pathway, not random applications.
Here's a realistic way to think about your advantage as a fresh graduate: you're early enough to build the right foundation before bad habits set in.
If you master fundamentals (coding, debugging, data structures, system thinking) and then layer modern tools on top (frameworks, cloud, CI/CD, analytics stacks), you become the kind of "entry-level" candidate who actually feels like a safe hire.
What roles are companies hiring for right now? A typical market demand pattern is clear: organizations still need entry-level software programmers, Java full stack developers, Python/Java developers, DevOps-focused engineers, and on the data side data analysts, BI analysts, data engineers, data scientists, and machine learning engineers.
The strongest candidates aren't "tool collectors": they're people who can show end-to-end capability: build an API, connect a database, deploy a service, analyze data, explain results, and handle interviews calmly.
Why fresh grads get stuck: Fresh grads often struggle for four predictable reasons:
- Resume doesn't match job keywords (ATS filters you out).
- Projects look like school assignments (not production-aligned).
- Interview skills are undertrained (DSA, system design, SQL, behavioral).
- No structured pipeline (random applying without feedback loops).
A job-placement-first approach addresses these systematically: build the right portfolio, practice the right interview questions, align your tech stack to roles, and keep improving until the market says "yes."
Who this path fits best: If you're a recent graduate, you'll likely fit if you match any of these:
- New grads in CS, Engineering, Math, or Statistics with limited job experience
- Students finishing Bachelor's or Master's programs who need a real hiring plan
- Candidates who apply consistently but don't get callbacks
- Candidates who reach interviews but struggle to close
- International students on F-1/OPT who need a job plan for STEM extension/H-1B timing
- Graduates with strong academics but thin practical experience
SynergisticIT supports STEM extension and work authorization pathways and, for candidates who need long-term stability, provides support related to H-1B and green card processes as part of employer-side realities.
If you're tired of guessing, stop treating your job search like a lottery.
Treat it like a project with milestones: skills → portfolio → interview readiness → targeted applications → scheduled interviews → offer.
If you want to explore, here are the key links:
- Event videos (OCW, JavaOne, Gartner)
- USA Today feature
- Contact & get a roadmap
Please read our blogs:
- Why do Tech Companies not Hire recent Computer Science Graduates | SynergisticIT
- What Recruiters Look for in Junior Developers | SynergisticIT
- Software engineering or Data Science as a career?
- How OPT Students Can Land Tech Jobs – SynergisticIT
Bottom line for fresh grads: Your degree is the starting line, not the finish line.
If you want to get hired faster, you don't need "more random courses." You need a guided, job-focused path and the right people around you.
In tech, it's not just what you learn—it's how you learn and who you build with that decides how far you go.
Please note: Resume databases are shared with clients and interested clients will reach out directly if they find a qualified candidate for their req.
Resume submissions may be shared with our JOPP team database also.
If you are contacted and no longer wish to be, please unsubscribe; if you do not want to be contacted at all, please do not submit your resume.
This is a 3-month contract opportunity with long-term potential and is located in the U.S. (Remote).
Please review the job description below and contact me ASAP if you are interested.
Job ID: 26-08963. Pay Range: $22 - $23/hour.
Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).
Key Responsibilities: Submit a minimum of 8 submissions per 8-hour shift. This target may change as process changes are implemented.
Review requests, research, and submit changes per regulations/business rules.
The main function of a data entry specialist is to operate data entry devices, such as a keyboard or computer, to verify and input data.
A typical data entry specialist is responsible for accurate information documentation and personal project management.
Read source documents such as practitioner profiles and emails, and enter data into specific data fields or onto tapes or disks for subsequent entry, using keyboards or scanners.
Compile, sort and verify the accuracy of data before it is entered.
Locate and correct data entry errors or report them to supervisors.
Compare data with source documents, or re-enter data in verification format to detect errors.
Maintain logs of activities and completed work.
Key Requirements and Technology Experience: Key technical skills include documentation and time management. Health plan experience, data entry experience, and previous experience with computer applications such as Microsoft Word and Excel.
3-5 years of data entry experience is required.
A High School Diploma or GED is required.
Our client is a leader in the healthcare industry, and we are currently interviewing to fill this and other similar contract positions.
If you are interested in this position, please apply online for immediate consideration.
Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
By applying to our jobs you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates and contracted partners.
Frequency varies for text messages.
Message and data rates may apply.
Carriers are not liable for delayed or undelivered messages.
You can reply STOP to cancel and HELP for help.
You can access our privacy policy here.