Data Migration vs. Database Migration: Remote Jobs in the USA
Minimum of five years experience working in analytics with hospitals and health plans.
Advanced proficiency required with VBA, SQL, Salesforce, Excel and Access.
High-level skills using web applications and all browsers; ability to teach others how to use web-based database functions.
Demonstrated experience using Microsoft Office computer applications, including Word, Access, Outlook and SharePoint.
Advanced knowledge of Excel required.
Detail-oriented with strong follow-through and ability to work independently given standard guidelines and checklists.
Good writing and communication skills.
Able to draft grammatically correct and professional email messages.
Demonstrated experience in working successfully with minimal supervision.
Must have knowledge of medical and health care terminology.
Ability to complete HIPAA training and implement high-level protections on patient information and confidentiality.
Must work effectively independently and in a team setting.
Ability to relate well with internal and external customers.
Quality/Metrics: Gather and perform analysis on data from Salesforce, Loopback, Excel, and other databases as required.
Perform data cleaning as needed to ensure data are consistent and analyzable.
Create data reports, charts, graphs and tables for regular reporting to program leads and external partners.
Export data from software systems and program tracking logs for agency reporting.
Assemble reports, papers and presentation materials as directed.
Collect data through phone and in-person interviews.
Record or transcribe data in accordance with project and funding source guidelines.
Perform literature reviews (locating, listing &/or abstracting articles).
Enter literature references into a shared database (such as EndNote).
Responsibilities: Data cleaning, formatting, and maintenance as needed.
Data visualization and analysis of program metrics.
Data Entry for the program(s) assigned.
Program reporting/billing/invoicing support.
Administrative duties as needed (mailing and other assigned work).
Establish and maintain systems for program accountability; reports track performance.
Attend all meetings and presentations and ensure follow-up afterward: minutes, reports, action plans, and assignments.
Monitors field staff performance and responsibilities with respect to database management, metrics, and documentation.
Reports all errors in systems and workflows to both internal and external stakeholders.
Completes reporting (both internal and contractual requirements) with thorough knowledge and understanding of what is being reported.
Develops and maintains a current understanding of the Department’s Contractual Agreements.
Must have professional verbal and written skills, computer/software skills.
Assists with both internal and external customer service calls, emails, and requests.
Other miscellaneous tasks assigned as needed.
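The data-cleaning and consistency duties described above might look, in miniature, like the following Python sketch. The field names (`name`, `email`, `visit_date`) and the accepted date formats are hypothetical stand-ins, not part of any actual system named in the posting:

```python
from datetime import datetime

def clean_records(records):
    """Normalize raw export rows so they are consistent and analyzable.

    Assumes hypothetical fields: 'name', 'email', and 'visit_date'
    (which may arrive in several date formats).
    """
    seen = set()
    cleaned = []
    for row in records:
        # Collapse internal whitespace and normalize casing.
        name = " ".join(row.get("name", "").split()).title()
        email = row.get("email", "").strip().lower()
        raw_date = row.get("visit_date", "").strip()
        visit_date = None
        # Try a few common date formats; keep ISO 8601 as the canonical form.
        for fmt in ("%Y-%m-%d", "%m/%d/%Y", "%d-%b-%Y"):
            try:
                visit_date = datetime.strptime(raw_date, fmt).date().isoformat()
                break
            except ValueError:
                continue
        key = (name, email, visit_date)
        if key in seen:  # drop exact duplicates after normalization
            continue
        seen.add(key)
        cleaned.append({"name": name, "email": email, "visit_date": visit_date})
    return cleaned
```

In practice, rules like these would be driven by the project and funding-source guidelines the posting mentions, rather than hard-coded.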
SQL Server database design, implementation, and troubleshooting.
Develop, optimize, and maintain complex T-SQL queries, stored procedures, indexes, and constraints; resolve performance issues, deadlocks, and contention using traces, execution plans, and profiling.
Design, develop, test, and implement ETL/ELT processes using Talend for data extraction, transformation, and loading from diverse sources, including Salesforce CRM data.
Administer and optimize Talend environment, including job scheduling, dependencies, monitoring, automation, patches, upgrades, and performance tuning.
Integrate Salesforce data (e.g., via APIs, connectors) into SQL Server databases and data warehouses, ensuring data quality, synchronization, and real-time/batch processing.
Collaborate face-to-face with business stakeholders to analyze requirements, gather specifications, evaluate data sources/targets, and design solutions that improve business performance.
Lead ETL development activities, ensure code quality, provide feedback on performance.
Support enterprise data warehouse, data marts, and business intelligence initiatives; perform source data analysis and dimensional modeling.
Develop and automate processes using scripting.
Provide tier 2/3 support, evaluate production issues, recommend improvements, and participate in project planning following Agile methodologies.
Perform proactive performance optimization and data synchronization across environments.
Mentor staff, recommend process enhancements, and contribute specialized knowledge across IT and business operations.
Document data integration processes, workflows, ETL designs, data mappings, technical specifications, and system configurations.
Manage version control and deployments.
Collaborate on testing (unit, integration, UAT).
Translated business requirements into actionable data specifications, documentation, and code solutions using Salesforce Object Manager and official documentation.
Reviewed Salesforce release notes, verified production deployments, and conducted feature testing across sandbox and production environments with detailed feedback submission.
Developed and maintained complex SOQL queries to support data team operations, reporting, and analytics needs.
Designed and built custom Salesforce reports to support data operations and Enhanced Care Management (ECM) programs.
Developed and deployed end-to-end solutions for processing health plan MIF data, enabling efficient insert, update, and reporting workflows for Lead and Case objects.
Performed large-scale data inserts, updates, and migrations using Salesforce Data Loader in both sandbox and production environments.
Extracted, analyzed, and transformed backend Salesforce data using Talend and SQL to produce accurate reports for compliance, billing, and operational needs.
Identified and resolved reporting discrepancies and data quality issues through root-cause analysis and targeted corrections.
Cleaned, standardized, and transformed referral data for mass uploads into Salesforce while enforcing validation rules and workflow requirements.
Created Salesforce-based error reports that enabled program teams to quickly identify and correct data entry issues.
Conducted data gap analyses against vendor reporting requirements and designed field transformations and new data structures to meet compliance and reporting standards.
Integrated offshore datasets with Salesforce records to address missing or incomplete data, improving accuracy for reporting and billing.
Reduced manual data entry and correction efforts by automating large-scale updates, inserts, and fixes via Salesforce Data Loader.
Maintained vendor zip code records in Salesforce to ensure accurate service area tracking, correct billing rates, and reliable historical reference.
Partners in Care Foundation is an equal opportunity employer.
We are committed to complying with all federal, state, and local laws providing equal employment opportunities, and all other employment laws and regulations.
It is our intent to maintain a work environment which is free of harassment, discrimination, or retaliation because of age, race (including hair texture and protective hairstyles, such as braids, locks, and twists), color, national origin, ancestry, religion, sex, sexual orientation, pregnancy (including childbirth, lactation/breastfeeding, and related medical conditions), physical or mental disability, genetic information (including testing and characteristics, as well as those of family members), veteran status, uniformed service member status, gender, gender identity, gender expression, transgender status, arrest or conviction record, domestic violence victim status, credit history, unemployment status, caregiver status, sexual and reproductive health decisions, salary history or any other status protected by federal, state, or local laws.
All qualified applicants will receive consideration for employment and reasonable accommodations may be made to enable qualified individuals to perform the essential functions of the position.
Remote working/work at home options are available for this role.
Location: Anywhere in Country
At EY, we’re all in to shape your future with confidence.
We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.
AI & Data - Data Architecture – Senior Manager – Power & Utilities Sector
EY is seeking a motivated professional with solid experience in the utilities sector to serve as a Senior Manager with a robust background in Data Architecture, Data Modernization, end-to-end data capabilities, AI, Gen AI, and Agentic AI, preferably with a power systems / electrical engineering background and a record of delivering business use cases in Transmission / Distribution / Generation / Customer. The ideal candidate will have a history of working for consulting companies and be well-versed in the fast-paced culture of consulting work. This role is dedicated to the utilities sector, where the successful candidate will craft, deploy, and maintain large-scale, AI-ready data architectures.
The opportunity
You will help our clients enable better business outcomes while working in the rapidly growing Power & Utilities sector. You will have the opportunity to lead and develop your skill set to keep up with the ever-growing demands of the modern data platform. During implementation you will solve complex analytical problems to bring data to insights and enable the use of ML and AI at scale for your clients. This is a high growth area and a high visibility role with plenty of opportunities to enhance your skillset and build your career.
As a Senior Manager in Data Architecture, you will have the opportunity to lead transformative technology projects and programs that align with our organizational strategy to achieve impactful outcomes. You will provide assurance to leadership by managing timelines, costs, and quality, and lead both technical and non-technical project teams in the development and implementation of cutting-edge technology solutions and infrastructure. You will have the opportunity to be face to face with external clients and build new and existing relationships in the sector. Your specialized knowledge in project and program delivery methods, including Agile and Waterfall, will be instrumental in coaching others and proposing solutions to technical constraints.
Your key responsibilities
In this pivotal role, you will be responsible for the effective management and delivery of one or more processes, solutions, and projects, with a focus on quality and effective risk management. You will drive continuous process improvement and identify innovative solutions through research, analysis, and best practices. Managing professional employees or supervising team members to deliver complex technical initiatives, you will apply your depth of expertise to guide others and interpret internal/external issues to recommend quality solutions. Your responsibilities will include:
As Data Architect – Senior Manager, you will have an expert understanding of data architecture and data engineering and will be focused on problem-solving to design, architect, and present findings and solutions, leading more junior team members, and working with a wide variety of clients to sell and lead delivery of technology consulting services. You will be the go-to resource for understanding our clients’ problems and responding with appropriate methodologies and solutions anchored around data architectures, platforms, and technologies. You are responsible for helping to win new business for EY. You are a trusted advisor with a broad understanding of digital transformation initiatives, the analytic technology landscape, industry trends and client motivations. You are also a charismatic communicator and thought leader, capable of going toe-to-toe with the C-level in our clients and prospects and willing and able to constructively challenge them.
Skills and attributes for success
To thrive in this role, you will need a combination of technical and business skills that will make a significant impact. Your skills will include:
- Technical skills: Applications Integration
- Cloud Computing and Cloud Computing Architecture
- Data Architecture Design and Modelling
- Data Integration and Data Quality
- AI/Agentic AI driven data operations
- Experience delivering business use cases in Transmission / Distribution / Generation / Customer.
- Strong relationship management and business development skills.
- Become a trusted advisor to your clients’ senior decision makers and internal EY teams by establishing credibility and expertise in both data strategy in general and in the use of analytic technology solutions to solve business problems.
- Engage with senior business leaders to understand and shape their goals and objectives and their corresponding information needs and analytic requirements.
- Collaborate with cross-functional teams (Data Scientists, Business Analysts, and IT teams) to define data requirements, design solutions, and implement data strategies that align with our clients’ objectives.
- Organize and lead workshops and design sessions with stakeholders, including clients, team members, and cross-functional partners, to capture requirements, understand use cases, personas, key business processes, brainstorm solutions, and align on data architecture strategies and projects.
- Lead the design and implementation of modern data architectures, supporting transactional, operational, analytical, and AI solutions.
- Direct and mentor global data architecture and engineering teams, fostering a culture of innovation, collaboration, and continuous improvement.
- Establish data governance policies and practices, including data security, quality, and lifecycle management.
- Stay abreast of industry trends and emerging technologies in data architecture and management, recommending innovations and improvements to enhance our capabilities.
To qualify for the role, you must have
- A Bachelor’s degree in a STEM field
- 12+ years professional consulting experience in industry or in technology consulting.
- 12+ years hands-on experience in architecting, designing, delivering or optimizing data lake solutions.
- 5+ years’ experience with native cloud products and services such as Azure or GCP.
- 8+ years of experience mentoring and leading teams of data architects and data engineers, fostering a culture of innovation and professional development.
- In-depth knowledge of data architecture principles and best practices, including data modelling, data warehousing, data lakes, and data integration.
- Demonstrated experience in leading large data engineering teams to design and build platforms with complex architectures and diverse features including various data flow patterns, relational and no-SQL databases, production-grade performance, and delivery to downstream use cases and applications.
- Hands-on experience in designing end-to-end architectures and pipelines that collect, process, and deliver data to its destination efficiently and reliably.
- Proficiency in data modelling techniques and the ability to choose appropriate architectural design patterns, including Data Fabric, Data Mesh, Lakehouse, or Delta Lake.
- Experience managing complex data analysis, migration, and integration of enterprise solutions to modern platforms, including code efficiency and performance optimization.
- Previous hands‑on coding skills in languages commonly used in data engineering, such as Python, Java, or Scala.
- Ability to design data solutions that can scale horizontally and vertically while optimizing performance.
- Experience with containerization technologies like Docker and container orchestration platforms like Kubernetes for managing data workloads.
- Experience in version control systems (e.g. Git) and knowledge of DevOps practices for automating data engineering workflows (DataOps).
- Practical understanding of data encryption, access control, and security best practices to protect sensitive data.
- Experience leading Infrastructure and Security engineers and architects in overall platform build.
- Excellent leadership, communication, and project management skills.
- Data Security and Database Management
- Enterprise Data Management and Metadata Management
- Ontology Design and Systems Design
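As one illustration of the horizontal-scaling design point in the list above, records can be routed to shards with stable hashing. This is a minimal sketch under assumed conditions (fixed shard count, a string routing key); it is not a description of any EY methodology:

```python
import hashlib

def shard_for(key: str, num_shards: int) -> int:
    """Route a record key to a shard deterministically.

    Uses a stable digest (MD5 of the key) so the mapping is consistent
    across processes and restarts, unlike Python's built-in hash(),
    which is salted per interpreter run.
    """
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_shards
```

A real design would also address resharding (e.g., consistent hashing) so that changing `num_shards` does not remap every key.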
Ideally, you’ll also have
- Master’s degree in Electrical / Power Systems Engineering, Computer science, Statistics, Applied Mathematics, Data Science, Machine Learning or commensurate professional experience.
- Experience working at big 4 or a major utility.
- Experience with cloud data platforms like Databricks.
- Experience in leading and influencing teams, with a focus on mentorship and professional development.
- A passion for innovation and the strategic application of emerging technologies to solve real-world challenges.
- The ability to foster an inclusive environment that values diverse perspectives and empowers team members.
- Building and Managing Relationships
- Client Trust and Value and Commercial Astuteness
- Communicating With Impact and Digital Fluency
What we look for
We are looking for top performers who demonstrate a blend of technical expertise and business acumen, with the ability to build strong client relationships and lead teams through change. Emotional agility and hybrid collaboration skills are key to success in this dynamic role.
FY26NATAID
What we offer you
At EY, we’ll develop you with future-focused skills and equip you with world-class experiences. We’ll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams. Learn more.
- We offer a comprehensive compensation and benefits package where you’ll be rewarded based on your performance and recognized for the value you bring to the business. The base salary range for this job in all geographic locations in the US is $144,000 to $329,100. The base salary range for New York City Metro Area, Washington State and California (excluding Sacramento) is $172,800 to $374,000. Individual salaries within those ranges are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills and geography. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options.
- Join us in our team‑led and leader‑enabled hybrid model. Our expectation is for most people in external, client serving roles to work together in person 40-60% of the time over the course of an engagement, project or year.
- Under our flexible vacation policy, you’ll decide how much vacation time you need based on your own personal circumstances. You’ll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well‑being.
Are you ready to shape your future with confidence? Apply today.
EY accepts applications for this position on an on‑going basis.
For those living in California, please click here for additional information.
EY focuses on high‑ethical standards and integrity among its employees and expects all candidates to demonstrate these qualities.
EY | Building a better working world
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets.
Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow.
EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
EY provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, genetic information, national origin, protected veteran status, disability status, or any other legally protected basis, including arrest and conviction records, in accordance with applicable law.
EY is committed to providing reasonable accommodation to qualified individuals with disabilities including veterans with disabilities. If you have a disability and either need assistance applying online or need to request an accommodation during any part of the application process, please call 1-800-EY-HELP3, select Option 2 for candidate related inquiries, then select Option 1 for candidate queries and finally select Option 2 for candidates with an inquiry which will route you to EY’s Talent Shared Services Team (TSS) or email the TSS at .
Job Summary:
Our client is seeking a Data Steward to join their team! This position is located Hybrid in Creve Coeur, Missouri.
Duties:
- Understand business capability needs and processes as they relate to IT solutions through partnering with Product Managers and business and functional IT stakeholders
- Participate in data scraping, data curation and data compilation efforts
- Ensure high quality of the data to end users
- Ensure high quality of the inhouse data via data stewardship
- Implement and utilize data solutions for data analysis and profiling using a variety of tools such as SQL, Postman, R, or Python and following the team’s established processes and methodologies
- Collaborate with other data stewards and engineers within the team and across teams on aligning delivery dates and integration efforts
- Define data quality rules and implement automated monitoring, reporting, and remediation solutions
- Coordinate intake and resolution of data support tickets
- Support data migration from legacy systems, data inserts and updates not supported by applications
- Partner with the Data Governance organization to ensure data is secured and access is being managed appropriately
- Identify gaps within existing processes and create new documentation templates to improve existing processes and procedures
- Create mapping documents and templates to improve existing manual processes
- Perform data discoveries to understand data formats, source systems, etc. and engage with business partners in this discovery process
- Help answer questions from the end-users and coordinate with technical resources as needed
- Build prototype SQL and continuously engage with end consumers on enhancements
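The "define data quality rules and implement automated monitoring" duty above might look, in miniature, like this Python sketch. The rule set (field names, ID pattern, plant codes) is purely illustrative, not drawn from the client's actual systems:

```python
import re

# Illustrative rule set: each rule maps a field name to a predicate
# that returns True when the value is acceptable.
RULES = {
    "material_id": lambda v: bool(re.fullmatch(r"[A-Z]{2}-\d{6}", v or "")),
    "quantity": lambda v: isinstance(v, (int, float)) and v >= 0,
    "plant_code": lambda v: v in {"0001", "0002", "0003"},
}

def run_quality_checks(rows):
    """Return (row_index, field) pairs for every rule violation found."""
    violations = []
    for i, row in enumerate(rows):
        for field, predicate in RULES.items():
            if not predicate(row.get(field)):
                violations.append((i, field))
    return violations
```

In a production setting the violation list would feed the monitoring, reporting, and remediation workflows the posting describes, rather than being returned to a caller.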
Desired Skills/Experience:
- Bachelor's Degree in Computer Science, Engineering, Science, or other related field
- Applied experience with modern engineering technologies and data principles, for instance: big data, cloud compute, NoSQL, etc.
- Applied experience with querying SQL and/or NoSQL databases
- Experience in designing data catalogs, including data design, metadata structures, object relations, catalog population, etc.
- Data Warehousing experience
- Strong written and verbal communication skills
- Comfortable balancing demands across multiple projects / initiatives
- Ability to identify gaps in requirements based on business subject matter domain expertise
- Ability to deliver detailed technical documentation
- Expert level experience in relevant business domain
- Experience managing data within SAP
- Experience managing data using APIs
- Big Query experience
Benefits:
- Medical, Dental, & Vision Insurance Plans
- Employee-Owned Profit Sharing (ESOP)
- 401K offered
The approximate pay range for this position starts at $104,000 - $115,000+. Please note that the pay range provided is a good faith estimate. Final compensation may vary based on factors including but not limited to background, knowledge, skills, and location. We comply with local wage minimums.
At KellyMitchell, our culture is world class. We’re movers and shakers! We don’t mind a bit of friendly competition, and we reward hard work with unlimited potential for growth. This is an exciting opportunity to join a company known for innovative solutions and unsurpassed customer service. We're passionate about helping companies solve their biggest IT staffing & project solutions challenges. As an employee-owned, women-led organization serving Fortune 500 companies nationwide, we deliver expert service at a moment's notice.
By applying for this job, you agree to receive calls, AI-generated calls, text messages, or emails from KellyMitchell and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy at
**Must be able to be onsite in Farmington, CT 2 days a week for collaboration**
The Opportunity: We are seeking a software engineer/developer or ETL/data integration/big data developer with experience in projects emphasizing data processing and storage. This person will be responsible for supporting the data ingestion, transformation, and distribution to end consumers. Candidate will perform requirements analysis, design/develop process flow, unit and integration tests, and create/update process documentation.
- Work with the Business Intelligence team and operational stakeholders to design and implement both the data presentation layer available to the user community and the underlying technical architecture of the data warehousing environment.
- Develop scalable and reliable data solutions to move data across systems from multiple sources in real-time as well as batch modes.
- Design and develop database objects: tables, stored procedures, views, etc.
- Independently analyze, solve, and correct issues in real time, providing end-to-end problem resolution.
- Design and develop ETL processes that transform a variety of raw data, flat files, and Excel spreadsheets into SQL databases.
- Understand the concepts of data marts and data lakes, with experience migrating legacy systems to data marts/lakes.
- Use additional cloud technologies (e.g., understand the concept of cloud services like Azure SQL Server).
- Maintain comprehensive project documentation.
- Show aptitude for learning new technologies and the ability to perform continuous research, analysis, and process improvement.
- Demonstrate strong interpersonal and communication skills to work in a team environment including customer and contractor technical staff, end users, and management.
- Manage multiple projects, responsibilities, and competing priorities.
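A minimal, hedged sketch of the flat-file-to-SQL-database ETL step described above, using only Python's standard library (SQLite stands in for the SQL Server/Oracle targets named in the posting, and the `orders` table and its columns are hypothetical):

```python
import csv
import io
import sqlite3

def load_csv_to_table(csv_text: str, conn: sqlite3.Connection) -> int:
    """Extract rows from CSV text, transform types, and load them into SQL.

    Returns the number of rows loaded.
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id TEXT PRIMARY KEY, amount REAL)"
    )
    reader = csv.DictReader(io.StringIO(csv_text))
    # Transform: strip stray whitespace and cast the amount to a float.
    rows = [(r["order_id"].strip(), float(r["amount"])) for r in reader]
    # Load idempotently: re-running the same file upserts rather than fails.
    conn.executemany(
        "INSERT OR REPLACE INTO orders (order_id, amount) VALUES (?, ?)", rows
    )
    conn.commit()
    return len(rows)
```

A production ETL job would add staging tables, rejection handling for malformed rows, and logging, but the extract-transform-load shape is the same.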
Requirements / Experience Needed:
- Programming languages, frameworks, and file formats such as: Python, SQL, PL/SQL, and VB
- Database platforms such as: Oracle, SQL Server, MySQL
- Big data concepts and technologies such as Synapse and Databricks
- AWS and Azure cloud computing
- HVR data replication
Your role and responsibilities
About the Opportunity
IBM Consulting is seeking an accomplished Data & Analytics Associate Partner to accelerate our growth within the Industrial & Communications sectors. This executive role is responsible for shaping client vision, cultivating senior executive relationships, and developing data-driven solutions that enable clients to successfully navigate complex transformation programs.
You will bring together deep industry expertise and IBM’s portfolio of data, analytics, and AI capabilities to help organizations modernize their data ecosystems—migrating from legacy platforms to modern hybrid cloud architectures—while adopting next-generation analytics, GenAI, and agentic AI to strengthen decision-making and deliver measurable business and financial outcomes.
This role is ideal for a seasoned leader who integrates industry depth, consulting excellence, and technical thought leadership, has a strong understanding of competitive market dynamics, and consistently delivers high-impact transformation at scale.
Key Responsibilities
Market Leadership & Growth
Expand IBM’s Data & Analytics presence by identifying new market opportunities, developing differentiated solutions, and building a strong pipeline.
Engage senior client executives to understand strategic priorities and shape data transformation roadmaps aligned to their business and financial goals.
Lead end-to-end sales cycles, including solution definition, proposal leadership, financial structuring, and contract negotiation.
Strategic Advisory & Transformation Delivery
Advise C-suite leaders on strategies for data estate modernization, advanced analytics, GenAI, and agentic AI to drive business performance.
Architect integrated solutions that include:
Migration from legacy data platforms to modern cloud-based architectures
Data engineering and Information governance
Business intelligence and advanced analytics
GenAI-powered and agentic AI-driven automation and decisioning
Lead complex transformation programs from discovery through delivery, ensuring measurable outcomes and client satisfaction.
Engagement Excellence & Financial Stewardship
Oversee multi-disciplinary delivery teams to ensure high-quality, consistent execution across all program phases.
Manage engagement financials, including forecasting, margin performance, and overall portfolio profitability.
Align the right client technologies, industry expertise, and global delivery capabilities to maximize client value.
Practice Building & Talent Development
Recruit, mentor, and grow top-tier consultants, architects, and data specialists.
Build and scale capabilities in data modernization, cloud data engineering, analytics, GenAI, and emerging agentic AI techniques.
Contribute to practice strategy, offering development, and capability growth across the global Data & Analytics team.
Thought Leadership & Market Presence
Stay ahead of sector and technology trends, including cloud modernization, GenAI, agentic system design, regulatory changes, and evolving competitive dynamics.
Represent IBM at industry conferences, client events, webinars, and executive roundtables.
Create original thought leadership—articles, perspectives, points of view—that positions IBM as a leading advisor in data and AI-driven transformation.
This position can be performed anywhere in the US.
"Leaders are expected to spend time with their teams and clients and therefore are generally expected to be in the workplace a minimum of three days a week, subject to business needs."
Required technical and professional expertise
Qualifications
12+ years of experience in consulting, data strategy, analytics, or digital transformation, with strong exposure to the Industrial or Communications sectors.
Hands-on experience modernizing data ecosystems, including migrating from legacy on-premise platforms to modern cloud-native or hybrid cloud architectures.
Deep expertise with major cloud platforms and their data/analytics stacks, including implementation experience with:
AWS (e.g., Redshift, S3, Glue, EMR, Athena, Lake Formation, Bedrock, SageMaker)
Microsoft Azure (e.g., Azure Data Lake, Synapse, Data Factory, Databricks on Azure, Fabric, Cognitive Services)
Google Cloud Platform (e.g., BigQuery, Cloud Storage, Dataflow, Dataproc, Vertex AI)
Experience designing and implementing end-to-end data pipelines, governance frameworks, and analytics solutions on one or more of these platforms.
Strong understanding of GenAI architectures, LLM integration patterns, vector databases, retrieval-augmented generation (RAG), and emerging agentic AI frameworks.
Proven track record of selling, structuring, and delivering large-scale data and AI transformation programs.
Robust technical and functional expertise in data engineering, cloud data platforms, analytics, AI/ML, information management, and governance.
Executive-level communication and presence, with demonstrated ability to influence senior stakeholders and convey complex topics through compelling narratives.
Financial management experience, including engagement economics, forecasting, margin optimization, and portfolio profitability.
Demonstrated leadership in building, scaling, and developing high-performing consulting and technical teams.
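For candidates gauging the retrieval-augmented generation (RAG) expectation above, a minimal sketch of the retrieval step may help: documents and a query are embedded as vectors, and the closest documents are selected as context for an LLM. A toy bag-of-words embedding stands in for a real embedding model here; all names and data are illustrative, not IBM's implementation.

```python
# Minimal RAG retrieval sketch: toy term-frequency "embeddings" plus cosine
# similarity. A production system would use a real embedding model and a
# vector database instead.
from collections import Counter
import math

def embed(text: str) -> Counter:
    """Toy embedding: a term-frequency vector over whitespace tokens."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Return the k corpus documents most similar to the query."""
    q = embed(query)
    return sorted(corpus, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

corpus = [
    "cloud data migration from legacy warehouses",
    "agentic AI frameworks and LLM orchestration",
    "employee onboarding and benefits policies",
]
top = retrieve("migrating legacy data platforms to the cloud", corpus, k=1)
```

In a real RAG pipeline, `top` would be concatenated into the LLM prompt as grounding context; the sketch only shows the ranking step.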
Preferred technical and professional experience
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.
Data Analytics Internship
Los Angeles, CA, USA (Hybrid role)
Part-Time, $17.87/hr, Mid-April 2026 to Mid-August 2026.
DailyLook, a subsidiary of Victoria’s Secret & Co. (NYSE: VSCO) since being acquired in December 2022, is seeking a Data Analytics Intern. This internship offers the opportunity to work across two key teams at DailyLook: Demand Planning and Data Growth. The intern will be at the core of the business, leveraging data and analytics to support strategic initiatives and help drive data-informed improvements across operations, inventory planning, and growth initiatives. This is a great chance to gain hands-on experience working with real business data while contributing to impactful decisions!
Qualifications for the Position
- A degree in (or a junior, senior, or graduate student pursuing a degree in): data science, statistics, computer science, economics (quantitative track), applied analytics, mathematics, or business analytics.
- GPA 3.3+ preferred
- Coursework or experience in: statistical analysis, data analytics, machine learning.
- Experience with database systems, SQL and Python
- Familiarity with BI tools such as Looker or Tableau.
- Exemplary interpersonal communication skills both verbal and written
- Highly motivated, collaborative
- Experience in a Startup or Retail industry is an extra plus!
- An intellectually curious team player with a no-compromises approach to work quality, attention to detail, organization, and the ability to manage multiple priorities and projects in a fast-paced environment
- Self-motivated, detail-oriented, hands-on go-getter with the ability to build and suggest overhaul processes where needed, take initiative, work independently and proactively, multi-task, and remain flexible with changing priorities
- “I’ll find a way!” mindset where you can leverage your autonomy within your role to think outside the box.
- Demonstrated ability to communicate and collaborate effectively across global teams by adapting to diverse cultural norms, respecting time zone differences, and leveraging digital collaboration tools to maintain alignment and productivity
- Skilled in building trust and fostering inclusive communication styles that support clarity, empathy, and shared goals in international work environments
- Ability and willingness to work on-site at our office in Downtown LA at least once a week.
Responsibilities
- Reports to the Planning Team.
- Maintain and migrate existing demand planning and inventory reports to the current BI tool.
- Build and update weekly and monthly dashboards covering product performance, box performance, and styling metrics
- Assist in developing demand planning assumptions and forecasting frameworks (style demand, size curves, inventory flow)
- Build basic planning tools in Google Sheets / BI tools to support: Size curve projections & Product lifecycle tracking
- Conduct assortment and scenario analysis to support predictive demand planning
- Analyze inventory health, sell-through trends, and replenishment opportunities
- Identify optimization opportunities within the current planning workflow and BI infrastructure
- Document demand planning processes and support improvements to internal planning tools.
- Support the team in analyzing marketing and subscription performance, including acquisition, traffic/funnel, CRM, engagement, etc.
- Support migration and setup of analytics tools and platforms to improve tracking of user behavior and marketing performance
- Assist with dashboard updates, reporting, and basic data checks to ensure data quality
- Help monitor A/B tests and experiments for CRM campaigns and website initiatives
- Conduct ad-hoc analyses to provide insights and recommendations for the team
- Document data workflows & the new data infrastructure.
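As a concrete illustration of the size-curve projection responsibility above, the basic arithmetic is: derive each size's historical share of sales, then apply those shares to a forecast of total units. The figures below are invented for illustration and do not reflect DailyLook data.

```python
# Toy size-curve projection: historical share by size applied to a forecast.
historical_sales = {"S": 120, "M": 300, "L": 180}
total_sold = sum(historical_sales.values())  # 600 units sold historically

# The "size curve": each size's fraction of historical demand.
size_curve = {size: qty / total_sold for size, qty in historical_sales.items()}

# Apply the curve to a forecast of total units for the next period.
forecast_total = 1000
size_plan = {size: round(forecast_total * share) for size, share in size_curve.items()}
```

In practice this lives in Google Sheets or a BI tool, as the bullet notes, with smoothing for sizes that have thin sales history.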
Compensation & Benefits
The pay for this position is $17.87 an hour. This is a non-exempt, part-time position.
DailyLook is proud to provide equal opportunity to all employees and qualified applicants without regard to race, color, religion, national origin or citizenship, age, sex, marital status, ancestry, legally protected physical or mental disability, veteran status, gender identity, sexual orientation or any other basis protected under applicable law.
By applying for this position, the applicant authorizes DailyLook to check all references listed on the application and/or resume.
Visa Status: US Citizen or Green Card Only
Location: Irving, TX (Local Candidates Only)
Employment Type: Full-time / Direct Hire
Work Environment: Hybrid (Monday through Thursday in office / Friday at home)
***MUST HAVE 10+ YEARS EXPERIENCE AS A DATA ENGINEER***
The AWS Senior Data Engineer will own the planning, design, and implementation of data structures for this leading Hospitality Corporation in their AWS environment. This role will be responsible for incorporating all internal and external data sources into a robust, scalable, and comprehensive data model within AWS to support business intelligence and analytics needs throughout the company.
Responsibilities:
- Collaborate with cross-functional teams to understand and define business intelligence needs and translate them into data modeling solutions
- Develop, build, and maintain scalable data pipelines, data schema design, and dimensional data modeling in Databricks and AWS for all system data sources, API integrations, and bespoke data ingestion files from external sources, including batch and real-time pipelines.
- Responsible for data cleansing, standardization, and quality control
- Create data models that will support comprehensive data insights, business intelligence tools, and other data science initiatives
- Create data models and ETL procedures with traceability, data lineage and source control
- Design and implement data integration and data quality framework
- Implement data monitoring best practices with trigger based alerts for data processing KPIs and anomalies
- Investigate and remediate data problems, performing and documenting thorough and complete root cause analyses. Make recommendations for mitigation and prevention of future issues.
- Work with Business and IT to assess efficacy of all legacy data sources, making recommendations for migration, anonymization, archival and/or destruction.
- Continually seek to optimize performance through database indexing, query optimization, stored procedures, etc.
- Ensure compliance with data governance and data security requirements, including data life cycle management, purge and traceability.
- Create and manage documentation and change control mechanisms for all technical design, implementations and systems maintenance.
Target Skills and Experience
- Bachelor's or graduate degree in computer science, information systems or related field preferred, or similar combination of education and experience
- At least 10 years' experience designing and managing data pipelines, schema modeling, and data processing systems.
- Experience with Databricks a plus (or similar tools like Microsoft Fabric, Snowflake, etc.) to drive scalable data solutions.
- Experience with SAP a plus
- Proficient in Python, with a track record of solving real-world data challenges.
- Advanced SQL skills, including experience with database design, query optimization, and stored procedures.
- Experience with Terraform or other infrastructure-as-code tools is a plus.
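To make the data cleansing, standardization, and quality-control responsibility above concrete, here is a minimal Python sketch of that step: standardize a field, reject rows that fail a quality rule, and deduplicate on a business key. The record shape and rules are hypothetical examples, not the employer's actual pipeline.

```python
# Sketch of cleansing + standardization + quality control on raw records.
def cleanse(records):
    """Return (clean_rows, rejected_rows) after standardization and checks."""
    seen_ids, clean, rejected = set(), [], []
    for rec in records:
        email = rec["email"].strip().lower()   # standardize: trim + lowercase
        if "@" not in email:                   # quality rule: email must look valid
            rejected.append(rec)
            continue
        if rec["id"] in seen_ids:              # deduplicate on the business key
            continue
        seen_ids.add(rec["id"])
        clean.append({**rec, "email": email})
    return clean, rejected

raw = [
    {"id": "001", "email": "  Alice@Example.COM "},
    {"id": "001", "email": "alice@example.com"},   # duplicate ID, dropped
    {"id": "002", "email": "not-an-email"},        # fails quality rule, rejected
]
clean, rejected = cleanse(raw)
```

At the scale this role describes, the same logic would run inside Databricks jobs over Spark DataFrames rather than Python lists, but the cleanse/reject/dedupe pattern is the same.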
The University of Maryland (UMD) seeks a Manager of Data Analytics Enablement to lead the adoption and modernization of enterprise analytics capabilities that enable trusted, data-informed decision-making across campus.
This is an exciting time to join UMD as we advance enterprise data and analytics through a period of innovative growth and modernization.
This role will play a key part in shaping the future of enterprise business intelligence, advancing Microsoft Power BI and Fabric capabilities, and embedding sustainable data quality and stewardship practices into analytics workflows.
Reporting to the Director of Enterprise Data Services, this position partners with institutional leaders, IT teams, and enterprise stakeholders to deliver reliable data products, consistent metrics, and actionable insights.
The manager will lead a team of data professionals and advance practical, operational governance practices that support trusted analytics and long-term institutional impact.
Key Responsibilities: Lead the strategy, development, and continuous improvement of the university’s enterprise business intelligence environment, including Microsoft Power BI and Microsoft Fabric.
Establish standards, best practices, and architectural patterns for semantic models, dashboards, and analytics delivery.
Guide migration and modernization efforts to ensure scalable, secure, and high-performing analytics solutions.
Develop and manage an analytics intake, prioritization, and delivery framework aligned with institutional priorities.
Define and implement data quality monitoring practices to ensure reliability, accuracy, and consistency of enterprise data assets.
Partner with technical teams to embed validation, monitoring, and observability into data pipelines and lakehouse environments.
Promote consistent metric definitions and collaborate with campus stakeholders to clarify data ownership and stewardship roles.
Support adoption of metadata management, data catalog, and lineage capabilities.
Ensure analytics solutions align with university standards for security, privacy, and responsible data use.
Manage, mentor, and develop a team of analytics and data professionals, fostering a culture of quality, collaboration, and service.
Communicate analytics priorities, progress, and impact to leadership and campus partners.
This position is considered essential and may be required to work at the normal work location or an alternative location during a major catastrophic event, weather emergency, or other operational emergency to help maintain the continuity of University services.
May be required to work evenings, nights, weekends, or different shifts for extended periods.
KNOWLEDGE, SKILLS, & ABILITIES: Knowledge of data privacy and security principles and practices necessary to protect systems and data from threats.
Knowledge in areas of subject matter expertise such as databases, data modeling, ETL, reporting, data governance practices, metadata management, data stewardship, and/or regulatory compliance.
Skill in SQL or programming/scripting languages (e.g., Python) used for integrations, data pipelines, report development, and data management.
Skill in adapting communication style to different audiences, including technical, business, and executive stakeholders.
Skill in the use of office productivity software such as Office 365 or Google Workspace.
Ability to lead presentations and training for large groups.
Ability to manage communications and relationships with technical and business stakeholders.
Ability to collaborate effectively with other Managers, Assistant Directors, and Directors to identify and solve problems, make improvements, and address ongoing issues.
Ability to provide a team with effective direction and support in implementations using standards and techniques that lead to a repeatable and reliable solution.
Ability to ensure documentation standards and procedures are implemented for all team responsibilities.
Ability to define deadlines and manage the quality of the work delivered.
Ability to comprehend and handle interpersonal dynamics, demonstrate empathy towards team members, and effectively manage conflicts or challenging circumstances.
Ability to coach and mentor team members in order to enhance their performance, provide constructive feedback, and support skill development.
Physical Demands: Sedentary work.
Exerting up to 10 pounds of force occasionally and/or negligible amount of force frequently or constantly to lift, carry, push, pull or otherwise move objects.
Repetitive motion.
Substantial movements (motions) of the wrists, hands, and/or fingers.
The worker is required to have close visual acuity to perform an activity such as: preparing and analyzing data and figures; transcribing; viewing a computer terminal; extensive reading.
Minimum Qualifications Education: Bachelor’s degree from an accredited college or university.
Experience: Three (3) years of professional experience supporting the operations, maintenance, and administration of data systems, analytics platforms, or data management programs.
One (1) year leading or supervising professional staff.
Other: Additional work experience as defined above may be substituted on a year for year basis for up to four (4) years of the required education.
Preferences: Demonstrated experience leading business intelligence or enterprise analytics initiatives.
Experience managing or mentoring data professionals in a collaborative team environment.
Strong experience with Power BI and modern data platforms such as Microsoft Fabric, Databricks, or similar cloud-based analytics ecosystems.
Proficiency with SQL and/or Python in support of analytics, data modeling, or data quality initiatives.
Experience implementing or advancing data quality practices, including validation, monitoring, or metric standardization.
Experience supporting practical data governance activities such as establishing shared definitions, coordinating data stewardship, or implementing metadata/catalog tools.
Demonstrated ability to collaborate across diverse stakeholders and translate business needs into scalable analytics solutions.
Strong communication skills with the ability to engage both technical and non-technical audiences.
Experience using Jira or similar tools for work intake, project tracking, and prioritization.
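The data-quality validation and monitoring experience preferred above often takes the form of declarative rules checked against data assets. A minimal sketch, with hypothetical column names and rules (not UMD's actual checks):

```python
# Rule-based data-quality validation: declare a pass/fail rule per column,
# then report every (row, column) pair that violates its rule.
rules = {
    "gpa": lambda v: 0.0 <= v <= 4.0,   # plausible-range check
    "credits": lambda v: v >= 0,        # non-negative check
}

def validate(rows, rules):
    """Return a list of (row_index, column) pairs that fail a rule."""
    failures = []
    for i, row in enumerate(rows):
        for col, passes in rules.items():
            if not passes(row[col]):
                failures.append((i, col))
    return failures

records = [
    {"gpa": 3.8, "credits": 15},
    {"gpa": 4.6, "credits": -3},   # both fields out of range
]
failures = validate(records, rules)
```

In an enterprise setting the same idea is usually expressed through a framework or platform feature (the posting mentions embedding validation into pipelines and lakehouse environments), with failures feeding monitoring dashboards rather than a Python list.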
Additional Information: Please note that all positions within the Division of Information Technology (DIT) have an in-person component, with expected time each week at our College Park, MD location.
Telework is not a guaranteed work arrangement.
Visa Sponsorship Information: DIT will not sponsor the successful candidate for work authorization in the United States now or in the future.
F1 STEM OPT support is not available for this position.
Required Application Materials: Resume, Cover Letter, List of three References
Best Consideration Date: March 26, 2026
Open Until Filled: Yes
Salary Range: $149,120.00 - $178,944.00
Please apply at:
Job Risks: Not Applicable to This Position
Financial Disclosure Required: No
For more information on Financial Disclosure, please visit Maryland's State Ethics Commission website.
Department: DIT-EE-Enterprise Data Services
Worker Sub-Type: Staff Regular
Benefits Summary: For more information on Regular Exempt benefits, select this link.
Background Checks: Offers of employment are contingent on completion of a background check.
Information reported by the background check will not automatically disqualify anyone from employment.
Before any adverse decision, the finalist will have an opportunity to provide information to the University regarding disclosable background check information.
The University reserves the right to rescind the offer of employment or otherwise decline or terminate employment if the information reported by the background check is deemed incompatible with the position, regardless of when the background check is completed.
Employment Eligibility: The successful candidate must complete employment eligibility verification (on Form I-9) by presenting documents that establish identity and work authorization within the timeframe required by federal immigration law, and where applicable, to demonstrate renewed employment authorization.
Failure to complete employment eligibility verification or reverification within the timeframe set forth by law may result in suspension or termination of employment.
EEO Statement : The University of Maryland, College Park is an Equal Opportunity Employer.
All qualified applicants will receive equal consideration for employment.
Please read the University’s Equal Employment Opportunity Statement of Policy.
Title IX Non-Discrimination Notice
See above description for requirements.
Job Description Summary
We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. In this role, you will be instrumental in designing, building, and maintaining robust and scalable data pipelines and solutions within the Microsoft Azure ecosystem. You will be responsible for developing and optimizing ETL/ELT processes, ensuring data quality, and enabling efficient data access for analytics and business intelligence. We are looking for a hands-on engineer who thrives in a fast-paced environment and is passionate about leveraging cutting-edge technologies.
Key Responsibilities:
Design, develop, and maintain cloud-based data pipelines and ETL/ELT workflows.
Build and optimize data architectures to support structured and unstructured data processing.
Collaborate with data analysts, data scientists, and business stakeholders to understand data needs.
Implement data quality, security, and governance best practices.
Monitor and troubleshoot data workflows to ensure high availability and performance.
Optimize database and data storage solutions for performance and cost efficiency.
Contribute to cloud adoption, migration, and modernization initiatives.
Mandatory Skills:
Strong expertise with Azure cloud platform.
Strong experience in Databricks
Azure Data Factory proficiency required, including building datasets, data flows, and pipelines in ADF (not just maintaining existing ones).
Hands-on experience with ETL/ELT tools and frameworks.
Proficiency in SQL, Python, and data modeling.
Knowledge of CI/CD pipelines and infrastructure-as-code tools.
Understanding of data governance, security, and compliance.
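For context on the ETL/ELT experience required above, the pattern reduces to three stages: pull raw rows from a source, standardize them into a consistent schema, and land them in a target store. The sketch below uses invented function names and in-memory lists purely for illustration; a real pipeline here would run in Azure Data Factory or Databricks.

```python
# Minimal extract -> transform -> load sketch with invented data.
def extract():
    """Pretend source: raw rows as they might arrive from an API or file."""
    return [{"order_id": "A1", "amount": "10.50"},
            {"order_id": "A2", "amount": "3"}]

def transform(rows):
    """Standardize types so downstream consumers see a consistent schema."""
    return [{"order_id": r["order_id"], "amount": float(r["amount"])} for r in rows]

def load(rows, sink):
    """Append to the target store and report how many rows landed."""
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

The ELT variant the posting also names simply reorders the last two stages: raw rows are loaded first and transformed inside the warehouse engine.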
Preferred Skills:
Exposure to API integration and microservices architecture.
Strong analytical and problem-solving skills.
Azure cloud certifications and/or past experience
AKS (Azure Kubernetes Service) experience, and ETL related to applications containerized & deployed on AKS (or EKS)
Job Description: The State of Connecticut (CT) is seeking a Digital Accessibility Web Developer with deep experience in remediating accessibility issues across a wide range of platforms and technologies.
You will partner closely with our accessibility testers and analysts to turn accessibility audit findings into fully remediated digital experiences that meet or exceed compliance standards.
The ideal candidate will have expert-level experience remediating accessibility barriers in CMS systems such as Sitecore, Salesforce, and custom web applications (HTML/ARIA/CSS/JavaScript), as well as working knowledge of AWS services, Biznet platforms, and enterprise databases.
You will be hands-on in HTML and accessibility markup remediation, working primarily within the State's CMS platforms and custom HTML environments.
You'll partner with digital accessibility testers to review audit findings and make front end code corrections to ensure WCAG 2.1 AA compliance.
Remediation Focus Areas
Apply accessibility fixes to front-end code and markup issues identified through audits (e.g., color corrections, alt text, heading structure, keyboard navigation, link roles, ARIA roles).
Modify and restructure HTML, CSS, and ARIA to comply with WCAG 2.1 AA standards.
Work within CMS platforms such as Sitecore, Salesforce, and WordPress to correct issues in templates, content types, and presentation layers.
Support content and design teams with accessibility guidance for remediating documents, forms, and embedded media.
Use defect-tracking tools (Jira) to manage tickets and document fixes.
Collaborate with accessibility testers and content strategists to validate remediated work and prevent recurrence of issues.
Share knowledge and remediation patterns with other developers to promote consistency and sustainability.
Required Knowledge, Skills, and Abilities
Bachelor's degree in Computer Science, Software Engineering, IT, or a related field.
4 years of experience remediating digital accessibility issues in websites, apps, and platforms.
Strong coding experience in HTML, CSS, JavaScript, and ARIA markup.
Working knowledge of Sitecore and Salesforce platforms, with demonstrated remediation success.
Familiarity with Biznet applications, AWS infrastructure, or common enterprise back-end platforms.
Ability to interpret automated and manual testing results (e.g., Axe, ANDI, NVDA, JAWS) and apply solutions.
Expert knowledge of WCAG 2.1 AA standards and assistive-technology interactions.
Proficiency in CMS templates, JavaScript frameworks, back-end API configuration, and UI component libraries.
Experience troubleshooting keyboard traps, focus management, form label/field logic, and responsive layouts.
Strong ability to work in agile sprints, manage remediation tickets, and track progress in Jira or similar tools.
Ability to collaborate with QA testers, content editors, and project managers in an agile environment.
Excellent communication and documentation skills for explaining fixes and coaching teams.
Preferred Skills and Qualifications
Experience with Sitecore MVC or SXA customization.
Front-end developer or CMS certifications.
Experience with accessibility remediation tools.
Experience with customized CMS themes, templates, and components.
Strong attention to content structure (heading levels, alt text, semantic HTML).
Experience remediating PDF, Word, or PowerPoint documents (for secondary support).
Familiarity with CI/CD integration of accessibility checks (e.g., axe-core in pipelines).
Familiarity with design handoff tools (e.g., Figma or Adobe XD) for accessibility review.
Desired Certifications
One or more of the following:
IAAP WAS (Web Accessibility Specialist), strongly preferred
IAAP CPACC
DHS Trusted Tester Certification
Deque University Developer Track Certificate
Salesforce Accessibility Champion or similar
Prior PowerCenter → IDMC migration experience
Experience or familiarity with Linux system administration activities