GPT Models API List Jobs in USA
7,430 positions found — Page 11
IFBF is Iowa's largest farm organization, established in 1918.
We remain a statewide, non-profit, grassroots farm organization dedicated to creating a vibrant future for agriculture, farm families, and rural communities.
The Information Resources department is responsible for creating systems to manage memberships and support the ongoing business of Iowa Farm Bureau.
What You'll Do: We are seeking an experienced and skilled Senior Full Stack Developer with expertise in Azure, C#, .NET, SQL, API integration, and frontend development frameworks like Angular.
As a senior developer, you will play a pivotal role in designing, developing, and deploying scalable web applications and cloud-based solutions that support our business needs.
You will work closely with cross-functional teams to ensure our applications are secure, high-performing, and user-friendly, utilizing best practices in cloud architecture, API management, and identity management via Azure Entra ID.
You will also: • Architect, design, and develop full stack applications and APIs using C#, .NET, SQL, and Angular for both internal and external-facing applications.
• Leverage Azure cloud services, including Azure App Services, Azure Functions, Azure SQL, and Azure Storage, to build scalable, reliable applications.
• Develop, deploy, and manage RESTful APIs that enable data and functionality sharing across platforms, ensuring optimal performance and scalability.
• Implement authentication and authorization using Azure Entra ID, including single sign-on, multi-factor authentication, and role-based access control (RBAC).
• Work with SQL Server and other database systems to design schemas, optimize queries, and manage database performance.
• Build and maintain user interfaces using Angular and other frontend frameworks, ensuring a responsive, consistent, and user-friendly experience.
• Ensure the quality and reliability of code through best practices, including unit testing, integration testing, code reviews, and adherence to coding standards.
• Provide comprehensive documentation for applications, APIs, and systems architecture; support troubleshooting and performance optimization as needed.
• Mentor junior developers, participate in code reviews, and collaborate with cross-functional teams to align technology solutions with business goals.
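The RBAC item in the list above is tied to Azure Entra ID in this stack, but the underlying check is simple and stack-agnostic; a minimal sketch in Python, where the role and action names are hypothetical examples (in a real Entra ID setup, roles would arrive as claims in a validated JWT, not a hard-coded dict):

```python
# Minimal role-based access control (RBAC): map roles to permitted actions
# and verify that at least one of a user's roles grants the requested action.
# Role and action names here are hypothetical illustrations.
ROLE_PERMISSIONS = {
    "member.admin": {"read_member", "edit_member", "delete_member"},
    "member.viewer": {"read_member"},
}

def is_authorized(user_roles, action):
    """Return True if any of the user's roles grants the action."""
    return any(action in ROLE_PERMISSIONS.get(role, set()) for role in user_roles)
```

The same lookup pattern applies whether the role set comes from token claims, a database, or a directory service.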
What It Takes to Join Our Team: • Bachelor's degree in Computer Science, Information Technology, or related field.
• 5+ years of experience in full stack development with a focus on Azure, C#, .NET, and Angular.
• Strong proficiency in C#, .NET, Azure, SQL, API Design, Angular and Azure Entra ID required.
• Strong analytical and problem-solving skills, with a solution-oriented mindset.
• Ability to work both independently and collaboratively in a team environment.
• Excellent communication and documentation skills.
• Experience with DevOps practices and tools, such as Azure DevOps, CI/CD pipelines, and version control (Git) preferred.
• Familiarity with containerization (Docker) and orchestration (Kubernetes) in the Azure ecosystem preferred.
• Experience in optimizing cloud architecture for cost-effectiveness and scalability preferred.
What We Offer You: When you're on our team, you get more than a great paycheck.
You'll hear about career development and educational opportunities.
We offer an enhanced 401K with a match, a defined benefit plan, low-cost health, dental, and vision benefits, and life and disability insurance options.
We also offer paid time off, including holidays and volunteer time, and teams who know how to have fun.
Add to that an onsite wellness facility with fitness classes and programs, a daycare center, and a cafeteria.
Iowa Farm Bureau... where the grass really IS greener!
Work Authorization/Sponsorship: Applicants must be currently authorized to work in the United States on a full-time basis.
We are not able to sponsor now or in the future, or take over sponsorship of, an employment visa or work authorization for this role.
For example, we are not able to sponsor OPT status.
About Pinterest:
Millions of people around the world come to our platform to find creative ideas, dream about new possibilities and plan for memories that will last a lifetime. At Pinterest, we're on a mission to bring everyone the inspiration to create a life they love, and that starts with the people behind the product.
Discover a career where you ignite innovation for millions, transform passion into growth opportunities, celebrate each other's unique experiences and embrace the flexibility to do your best work. Creating a career you love? It's Possible.
At Pinterest, AI isn't just a feature, it's a powerful partner that augments our creativity and amplifies our impact, and we're looking for candidates who are excited to be a part of that. To get a complete picture of your experience and abilities, we'll explore your foundational skills and how you collaborate with AI.
Through our interview process, what matters most is that you can always explain your approach, showing us not just what you know, but how you think. You can read more about our AI interview philosophy and how we use AI in our recruiting process here.
About tvScientific
tvScientific is the first and only CTV advertising platform purpose-built for performance marketers. We leverage massive data and cutting-edge science to automate and optimize TV advertising to drive business outcomes. Our solution combines media buying, optimization, measurement, and attribution in one, efficient platform. Our platform is built by industry leaders with a long history in programmatic advertising, digital media, and ad verification who have now purpose-built a CTV performance platform advertisers can trust to grow their business.
We are seeking a Staff Product Manager to lead the strategy and execution for identity graph and data partnership initiatives, critical to enabling high-performance, privacy-compliant targeting across our CTV advertising platform. This role will focus on developing and refining identity resolution capabilities, managing graph-based data integrations, and expanding the reach and accuracy of our audience recognition and measurement infrastructure.
Success in this role will require a blend of deep technical expertise in identity data, graph modeling, and data architecture, as well as strong product instincts and cross-functional leadership skills. You will work closely with Engineering, Data Science, and external data partners to build a resilient and scalable identity foundation for precise audience targeting and measurement.
What you'll do:
- Own the identity product strategy at tvScientific
- Lead the product vision for tvScientific's identity graph, enabling persistent, multi-device recognition across CTV and digital channels.
- tvSci Identity will serve multiple teams throughout the product and engineering ecosystem; it will be your role to align with the leadership of those teams to gather requirements, define goals, and monitor success.
- Partner with Data Engineering and Data Science to architect and optimize graph-based data models that represent user identity, household relationships, and device linkages.
- Design APIs and services for real-time identity resolution, enrichment, and activation in programmatic ad workflows.
- Grow identity data partnerships
- Source, evaluate, and onboard third-party identity and behavioral data providers to enhance graph completeness and targeting capabilities.
- Work with Legal, Security, and Data teams to ensure all data partnerships comply with CCPA, GDPR, and other global privacy standards.
- Lead the technical integration and operationalization of new identity and graph enrichment partners, ensuring reliable ingestion, mapping, and deployment.
- Maintain an ongoing view of the identity and data ecosystem, and recommend partnership or build strategies accordingly.
- Deliver a world-class adtech product
- Write detailed product requirements, data specifications, and user stories for identity graph services and data integration projects.
- Coordinate with Engineering and Infrastructure teams to deliver performant graph storage, traversal, and querying systems.
- Support Sales, Marketing, and Customer Success with technical narratives that explain the role and value of identity resolution in CTV targeting.
- Define and monitor key metrics related to graph quality (e.g., match rates, accuracy, persistence), identity coverage, and performance impact.
- Drive Industry Leadership
- Stay current with advancements in privacy-enhancing technologies (PETs), identity standards, and regulatory shifts impacting identity data use in advertising.
- Represent tvScientific in industry forums and with partners to position the company as a leader in CTV identity and data interoperability.
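Identity resolution of the kind described above is often modeled as connected components over observed device/ID linkages. A toy union-find sketch in Python illustrates the core idea (identifiers are made up; a production graph would also weigh linkage confidence and household vs. individual scope):

```python
# Toy identity graph: resolve device IDs that share observed linkages
# (e.g., a shared login or IP) into a single cluster using union-find.
class IdentityGraph:
    def __init__(self):
        self.parent = {}

    def _find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            # Path compression keeps lookups near-constant time.
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def link(self, a, b):
        """Record an observed linkage between two identifiers."""
        self.parent[self._find(a)] = self._find(b)

    def same_identity(self, a, b):
        """True if both identifiers resolve to the same cluster."""
        return self._find(a) == self._find(b)
```

Match rate, one of the graph-quality metrics mentioned above, then falls out naturally: the share of incoming IDs that resolve into a known cluster.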
What we're looking for:
- Experience in product management, technical partnerships, or solutions engineering roles focused on data-driven products, audience targeting, or marketing technology.
- Strong background working with Data Engineering and Data Science teams to operationalize audience strategies.
- Expertise in audience segmentation, identity resolution, data onboarding, and activation workflows.
- Experience sourcing, integrating, and managing third-party data partnerships.
- A wide array of data analytics experience and the tenacity to drive large datasets to comprehension and organization.
- Solid technical acumen - including APIs, data pipelines, audience graphs, and privacy frameworks. Ideal candidates should be able to operate directly on the datasets without engineering support.
- Exceptional communication skills, translating technical details into business value.
- Experience within the adtech ecosystem is required, with Connected TV (CTV) experience a strong plus.
In-Office Requirement Statement:
- We recognize that the ideal environment for work is situational and may differ across departments. What this looks like day-to-day can vary based on the needs of each organization or role.
Relocation Statement:
- This position is not eligible for relocation assistance. Visit our PinFlex page to learn more about our working model.
#LI-REMOTE
At Pinterest we believe the workplace should be equitable, inclusive, and inspiring for every employee. In an effort to provide greater transparency, we are sharing the base salary range for this position. The position is also eligible for equity. Final salary is based on a number of factors including location, travel, relevant prior experience, or particular skills and expertise.
Information regarding the culture at Pinterest and benefits available for this position can be found here.
US based applicants only: $164,695—$339,078 USD
Our Commitment to Inclusion:
Pinterest is an equal opportunity employer and makes employment decisions on the basis of merit. We want to have the best qualified people in every job. All qualified applicants will receive consideration for employment without regard to race, color, ancestry, national origin, religion or religious creed, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, age, marital status, status as a protected veteran, physical or mental disability, medical condition, genetic information or characteristics (or those of a family member) or any other consideration made unlawful by applicable federal, state or local laws. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. If you require a medical or religious accommodation during the job application process, please complete this form for support.
AI Data & Python Tools Engineer
We're seeking an AI Data and Python Tools Engineer to develop and deploy intelligent tools that leverage big data infrastructure and modern AI architecture. This role combines strong software engineering fundamentals with the ability to build production-ready AI applications at speed, including integration with Model Context Protocol (MCP) systems.
Responsibilities:
- Develop and deploy AI-powered full-stack applications using Python, React, and modern machine learning frameworks
- Design and streamline data pipelines, train and validate ML models, and implement robust evaluation methods
- Collaborate with cross-functional teams to solve complex problems and integrate scalable, cloud-based AI solutions
- Rapidly prototype, test, and iterate on AI tools with a strong focus on performance, flexibility, and scalability
- Maintain clear technical documentation, perform code reviews, and support the full software development lifecycle
Software Engineering & AI/ML Data Tools Development
- 3+ years of Python development with a background in backend services and data processing
- Exposure to AI/ML algorithms
- Familiarity with ML frameworks (TensorFlow, PyTorch, scikit-learn)
- Understanding of LLMs, vector databases, and retrieval systems
- Experience with Model Context Protocol (MCP) integration and server development
Big Data & Cloud Infrastructure
- Knowledge of building and deploying cloud-based applications
- Hands-on experience with cloud data platforms (AWS/GCP/Azure)
- Proficiency with big data technologies (Spark, Kafka, or similar streaming platforms)
- Experience with data warehouses (Snowflake, BigQuery, Redshift) and data lakes
- Knowledge of containerization (Docker/Kubernetes) and infrastructure as code
Preferred Experience
- Experience building web applications with modern frameworks (React, Vue, or Angular)
- API development and integration experience
- Basic UX/UI design sensibilities for internal tooling
- Experience with real-time data processing and analytics
- Background in building developer tools or internal platforms
- Familiarity with AI/ML operations (MLOps) practices (e.g., experience using Airflow)
- Experience building MCP servers and integrating with AI assistants
- Knowledge of structured data exchange protocols and API design for AI systems.
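The "LLMs, vector databases, and retrieval systems" requirement above ultimately rests on nearest-neighbor search over embeddings. A dependency-free sketch of the ranking step, using made-up toy vectors in place of real embeddings:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec, corpus, k=2):
    """Rank (doc_id, vector) pairs by similarity to the query; return top-k ids."""
    scored = sorted(corpus, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]
```

A real vector database replaces the linear scan with an approximate index (HNSW, IVF), but the scoring contract is the same.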
Type: Full Time
Location: Austin, TX or Cupertino, CA (Monday-Friday onsite)
*Relocation assistance can be offered based on individual needs and circumstances*
We are seeking a talented Software Engineer 3 (Power BI Developer) to join a leading global financial institution on a long-term contract in Wilmington, DE. This role is ideal for someone with advanced Power BI skills, including DAX, Power Query/M, and complex data modeling, who has experience building executive dashboards and turning complex data into actionable insights. The position involves designing enterprise-level BI solutions, integrating data from multiple sources, and delivering analytics on toolchain adoption, productivity, and business impact. Candidates should have experience with platforms such as Jira, GitHub, Azure DevOps, and CI/CD tools, and be comfortable mentoring junior team members and collaborating with cross-functional teams. This is an exciting opportunity to influence decision-making and contribute to strategic initiatives at a senior level.
Job Title: Software Engineer 3 (Power BI Developer)
Job Location: Wilmington, Delaware 19803
Job Duration: 12 months (with possible extension)
Only W2 Candidates
Join a leading global financial institution and work with some of the brightest minds in the industry. This long-term contract opportunity offers a competitive benefits package and a chance to contribute to innovative solutions in the financial services space. If you’re passionate about leveraging data to drive business impact and enjoy creating insights that influence key decisions, this role is for you.
Required Skills & Experience
- 4+ years of software engineering experience, or equivalent through consulting, training, military service, or education.
- 6+ years of Power BI experience, with at least 3 years focused on advanced development in enterprise environments.
- Proven expertise in designing BI solutions for enterprise software development ecosystems, toolchain adoption, and DevOps maturity.
- Experience connecting Power BI to various toolchain platforms (e.g., Jira, GitHub, Azure DevOps, CI/CD tools) and designing KPIs for adoption, onboarding, and usage.
- Advanced proficiency in DAX, Power Query/M, and complex data modeling for management-level reporting.
- Experience building executive dashboards covering adoption, risk, compliance, automation, productivity, and cost savings.
- Strong data integration skills, including ETL, API extraction, direct query, and on-prem/cloud data source integration.
- Deep understanding of enterprise data governance, security, access controls, and reporting best practices.
- Excellent communication skills with experience collaborating with both technical and business stakeholders.
- Demonstrated leadership in project delivery, solution architecture, and mentoring junior team members.
Desired Skills & Experience
- Expertise in enterprise DevOps, SDLC/ALM toolchains, engineering productivity tooling, or related reporting domains.
- Experience supporting executive or board-level reporting initiatives.
- Microsoft Power BI and/or Power Platform certification.
- Experience in highly regulated or financial services environments.
Key Responsibilities
- Participate in moderately complex software engineering initiatives and contribute to planning and delivery of enterprise solutions.
- Review, analyze, and resolve complex software engineering and BI challenges.
- Collaborate with engineering, operations, and transformation teams to gather requirements, define key metrics, and ensure data accuracy for management reporting.
- Architect, develop, and maintain advanced Power BI dashboards and reports focused on toolchain adoption, process maturity, and business impact.
- Serve as the enterprise subject matter expert in toolchain reporting, with knowledge of common platforms such as Jira, GitHub, Azure DevOps, and CI/CD tools.
- Develop frameworks, data models, and methodologies to assess adoption and maturity metrics (e.g., tool usage, process adherence, automation coverage, delivery impact).
- Integrate data from multiple sources—including APIs, data lakes, internal databases, and vendor platforms—into Power BI using advanced transformations and DAX.
- Deliver meaningful executive and operational insights with robust drill-down capabilities for decision-making.
- Partner with business and IT leadership to present findings, recommend actions, and evolve analytics in alignment with strategic objectives.
- Define, document, and enforce best practices for management reporting, including data governance, security, and lifecycle management.
- Mentor and coach junior engineers and analysts on Power BI and toolchain reporting best practices.
- Maintain, monitor, and continuously enhance reporting solutions as enterprise needs evolve.
- Provide occasional after-hours support for critical reporting or deployment issues.
Are you an experienced Back End Developer with a desire to excel? If so, then Talent Software Services may have the job for you! Our client is seeking an experienced Back End Developer to work at their company in Richfield, MN.
Position Summary: We are seeking a DevOps Engineer to join our Enterprise API Management team. The successful candidate will be responsible for building, deploying, and operating our platform and services, with an emphasis on automation, reliability, and secure-by-default delivery. This role partners closely with engineers through collaboration and pair programming to improve CI/CD pipelines, troubleshooting practices, and operational readiness, and it also requires some software development experience (e.g., ability to read/debug code and contribute when needed). Technologies involved include Kubernetes, Helm charts, Java/Spring Boot, AWS offerings (certification preferred), and API platform capabilities such as API Gateway and security.
Qualifications:
- Kubernetes 5+ Years
- Helm Charts 4+ Years
- AWS (core offerings; certification preferred) 5+ Years
- Java and Spring Boot 5+ Years
- API Security 5+ Years
Preferred:
- CI/CD and GitOps
- Infrastructure-as-Code experience (e.g., Terraform/CloudFormation)
- API Gateway/API Management experience
- Observability tooling (logging/metrics/tracing) and on-call readiness
- Lua Programming
- Collaborative delivery practices (pair programming, code reviews)
We are seeking a Senior Lead Developer to lead the development and deployment of our backend services. In this role, you will be the bridge between our PostgreSQL database and React frontend, responsible not only for writing high-performance Python code but also for architecting the CI/CD pipelines that bring our applications to life. You will ensure our integration layers are scalable, secure, and automatically deployed.
Key Responsibilities
• API & Backend Development: Design and maintain production-grade RESTful APIs using Python (FastAPI, Flask) with a focus on asynchronous processing.
• Database Engineering: Architect relational schemas and write optimized SQL in PostgreSQL, ensuring data integrity and query performance.
• React Integration: Partner with frontend teams to define API contracts, handle state-consistent data fetching, and implement secure authentication (JWT/OAuth2).
• CI/CD & Deployment: Build and manage automated deployment pipelines (e.g., Azure DevOps or Jenkins) to move code from local environments to staging and production.
• Containerization & Cloud: Package applications using Docker and manage deployments on cloud platforms or container orchestrators (Kubernetes/ECS).
• System Reliability: Implement automated testing (PyTest), logging, and monitoring to ensure high availability of integration services.
Technical Requirements
• Experience: 10+ years of professional backend development with a heavy emphasis on Python and API architecture.
• PostgreSQL Expert: Advanced SQL knowledge, including indexing strategies, migrations (Alembic/Flyway), and performance profiling.
• DevOps Tooling: Hands-on experience with Docker and building CI/CD pipelines for Python applications.
• Frontend Literacy: Solid understanding of React (Hooks, Context API) and how it consumes complex JSON structures.
• Infrastructure as Code (Bonus): Familiarity with Terraform or AWS CloudFormation is a significant plus.
The "Lead" Expectation
At the 10-year mark, we expect more than just "feature delivery." We are looking for a candidate who:
• Automates Everything: If a task is done twice, they write a script or a CI job for it.
• Designs for Failure: Implements proper error handling, retries, and health checks in the API layer.
• Collaborates Across the Stack: Can jump into a React component or a Postgres execution plan to find the root cause of a bottleneck.
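"Designs for Failure" above is concrete in practice: transient faults in the API layer are absorbed by retries with backoff rather than surfaced to callers. A minimal retry decorator sketch in Python (parameter defaults here are illustrative, not a recommendation):

```python
import functools
import time

def retry(attempts=3, delay=0.1, backoff=2.0, exceptions=(Exception,)):
    """Retry a flaky call with exponential backoff before giving up."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            wait = delay
            for attempt in range(attempts):
                try:
                    return fn(*args, **kwargs)
                except exceptions:
                    if attempt == attempts - 1:
                        raise  # out of attempts: propagate the failure
                    time.sleep(wait)
                    wait *= backoff
        return wrapper
    return decorator
```

In production this belongs alongside jitter, a retry budget, and health-check endpoints so orchestrators (Kubernetes/ECS, per the requirements above) can route around unhealthy instances.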
Job description:
Position: Gen AI Engineer
Location: Irving TX
Duration: 12+ months
Job Overview
In this role, you will be responsible for translating AI strategy into tangible, production-ready capabilities that enhance operational efficiencies and drive business value. We're looking for someone who combines deep technical expertise in generative AI with a proven track record of successfully delivering complex technology projects.
Required Technical Skills
Deep knowledge of LLMs and advanced fine-tuning techniques. Proficient in Parameter-Efficient Fine-Tuning (PEFT) methods (LoRA, QLoRA, Adapter Tuning, Prefix Tuning), full fine-tuning, instruction tuning, and agentic AI techniques (RLHF, multi-task learning).
Expertise in model compression and quantization methods (AWQ, GPTQ, GPTQ-for-LLaMA). Proficiency with optimized inference engines such as vLLM, DeepSpeed, and FP6-LLM.
Adept at advanced prompt engineering techniques and best practices. Familiarity with frameworks that facilitate effective prompt design and management.
Advanced knowledge of RAG techniques, including hybrid search, multi-vector retrieval, Hypothetical Document Embeddings (HyDE), self-querying, query expansion, re-ranking, and relevance filtering.
Proficiency in TensorFlow, PyTorch, and Keras. Knowledge of distributed training, parallel processing, and extensive hands-on experience with AWS services for AI/ML.
Advanced NLP skills (NER, Dependency Parsing, Text Classification, Topic Modeling). Experience with Transfer Learning, Few-shot, and Zero-shot learning. Expertise in containerization (Docker), orchestration (Kubernetes), and CI/CD pipelines for MLOps.
Strong proficiency in data preprocessing, feature engineering, and handling large-scale datasets. Experience with real-time AI applications, streaming data, and designing RESTful APIs for model integration.
Experienced with LangGraph, AutoGen, LangChain, LlamaIndex, and Hugging Face Transformers. Familiarity with Gen AI APIs (OpenAI, Gemini, Claude) and version control systems like Git.
Knowledge of AI compliance frameworks and best practices. Experience implementing guardrails to ensure ethical AI usage and mitigate risks (e.g., Microsoft's AI Guidance Framework).
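Among the RAG techniques listed above, re-ranking is the easiest to sketch: retrieve a broad candidate set with a cheap first-stage scorer, then reorder the top results with a more precise second-stage one. Here token overlap stands in for a cross-encoder, purely for illustration:

```python
def score_overlap(query, doc):
    """Cheap relevance proxy: fraction of query tokens present in the doc."""
    q_tokens = set(query.lower().split())
    d_tokens = set(doc.lower().split())
    return len(q_tokens & d_tokens) / max(len(q_tokens), 1)

def rerank(query, candidates, top_n=2):
    """Second-stage re-ranking: reorder retrieved candidates by the precise score."""
    return sorted(candidates, key=lambda d: score_overlap(query, d), reverse=True)[:top_n]
```

In a real pipeline the candidates would come from a vector index and the second stage would be a learned re-ranker; the two-stage shape is what matters.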
Dexian is a leading provider of staffing, IT, and workforce solutions with over 12,000 employees and 70 locations worldwide. As one of the largest IT staffing companies and the 2nd largest minority-owned staffing company in the U.S., Dexian was formed in 2023 through the merger of DISYS and Signature Consultants. Combining the best elements of its core companies, Dexian's platform connects talent, technology, and organizations to produce game-changing results that help everyone achieve their ambitions and goals.
Dexian's brands include Dexian DISYS, Dexian Signature Consultants, Dexian Government Solutions, Dexian Talent Development and Dexian IT Solutions. Visit to learn more.
Dexian is an Equal Opportunity Employer that recruits and hires qualified candidates without regard to race, religion, sex, sexual orientation, gender identity, age, national origin, ancestry, citizenship, disability, or veteran status.
Note: Dexian Canada will, on request, provide accommodations for disabilities to support your participation in all aspects of our Recruitment and Assessment/Selection Processes.
About the Role
We are seeking a Backend Engineer to help build and maintain the backend services and APIs that power our proprietary AI SaaS CRM and LMS platforms.
You will work directly with our CTO, collaborate with the engineering team, and partner closely with our Product Manager to design, implement, and maintain scalable backend systems.
Our backend services are built primarily with:
- NestJS (TypeScript)
- Python
- Deployed across multiple AWS environments
This is a hands-on backend engineering role focused on API development, cloud deployment, distributed systems, and production-grade reliability. The role has meaningful ownership - not just ticket execution.
What You’ll Do
- Work directly with the CTO on backend design and implementation decisions
- Partner closely with a Product Manager on sprint planning, backlog grooming, translating product requirements into technical solutions, and prioritizing customer-impacting improvements
- Design, build, and maintain backend API services using NestJS (TypeScript)
- Build and support backend services in Python
- Develop and maintain production-grade RESTful APIs
- Contribute to multi-environment deployments across AWS
- Use Terraform to manage our infrastructure as code (IaC)
- Work with CI/CD workflows and structured deployment procedures
- Follow and contribute to engineering documentation including development guidelines, environment configuration standards, security practices, and versioning and changelog management
- Implement and support asynchronous and event-driven systems
- Write clean, maintainable, well-tested code
- Participate in code reviews and maintain high engineering standards
- Debug and resolve production issues across distributed cloud environments
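The "asynchronous and event-driven systems" responsibility above commonly reduces to publish/subscribe: producers emit events, and decoupled handlers react. A minimal in-process sketch in Python (a real AWS deployment would use SQS/SNS, EventBridge, or similar, and the topic name below is hypothetical):

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process pub/sub: handlers subscribe to topics, producers publish."""

    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a callable to run on every event published to the topic."""
        self._handlers[topic].append(handler)

    def publish(self, topic, payload):
        """Deliver the payload to every handler subscribed to the topic."""
        for handler in self._handlers[topic]:
            handler(payload)
```

The value of the pattern is that publishers never know who consumes the event, so new handlers can be added without touching producing code.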
What We’re Looking For (Required)
- 5+ years of backend engineering experience
- Strong proficiency in TypeScript and experience with NestJS
- Strong proficiency in Python
- Experience designing and implementing RESTful APIs
- Experience deploying and maintaining applications in AWS
- Familiarity with multi-environment deployments (dev, staging, UAT, production)
- Experience working with CI/CD pipelines
- Experience with relational databases (PostgreSQL)
- Familiarity with Docker or containerized workflows
- Experience working in GitHub-based workflows in a collaborative environment (pull requests, code reviews, branching strategies, and issue tracking)
- Comfortable working in an agile environment with JIRA and Monday
- Strong communication and problem-solving skills
- Experience building SaaS or multi-tenant platforms
Nice to Have / Strong Plus
- Familiarity with C# & C++
- Experience with Dentrix, OpenDental, or other dental practice management systems (PMS)
- Experience building a greenfield SaaS or B2B software
- Experience with building on a Healthcare platform
- Familiarity with AI-enabled products or LLM integrations
- Experience with Redis or caching strategies
- Experience integrating third-party APIs
Why This Role Is Different
- Direct collaboration with the CTO on backend system design
- Close partnership with Product Management
- Opportunity to help shape a modern, AI SaaS platform for the healthcare industry
Head of Business Operations
Brief Summary
The Head of Business Operations owns the configuration, integrity, and scalability of the company's business operations systems, serving as the bridge between business strategy and technical execution, reporting directly to the CEO/Co-Founder. This role is responsible for translating institutional knowledge into scalable business processes, ensuring data integrity, and enabling the transition from ad-hoc decision making to data-driven workflows. This is a senior management role with individual-contributor responsibilities, broad cross-functional authority, and high executive visibility.
The Head of Business Operations will take a lead role in defining the data architecture, implementing process guardrails, and analyzing operational data to drive strategy. This person acts as the cross-functional orchestrator of the business operations system, collaborating with Sales, Production, and Leadership to extract & refine business logic and codify it into streamlined processes. Success in this role requires a strong backbone to enforce higher standards, and an analytical and systems-thinking mindset to visualize downstream effects.
What Success Looks Like
● All core workflows are analyzable, have entrance/exit criteria, and are governed by continuously improving SOPs
● Leadership can answer key operational questions without ad-hoc data pulls
● Administrative overhead for sales and production staff is measurably reduced through intuitive, user-centric workflow design and automation.
● Data integrity is proactively enforced through automated validation gates, ensuring all transactions reaching Production meet technical completeness standards
● Schema changes follow a formal change process without disruptive production breakage
● Cross-team handoffs show measurable reductions in rework or delays
● Operational reporting has shifted from reactive status checks to predictive insights, providing automated triggers for churn risks and production bottlenecks
Duties & Responsibilities
Requirements Engineering (Internal Product Owner)
● Conduct structured interviews with stakeholders (Sales, Production) to extract complex business logic, transforming qualitative requirements into workflow pipelines, binary system gates, and automation triggers.
● Treat internal tools as a "Product" and internal staff as "Users," conducting user research to ensure workflows are intuitive and reduce friction.
● Act as the liaison between business stakeholders and technical teams to ensure alignment.
● Define, mandate, and manage the company's "Data Dictionary" and Standard Operating Procedures (SOPs), ensuring a unified language and common framework is adopted across all functional teams.
System Ownership & Platform Governance
● Own the configuration and architecture of the company’s operating platform (currently ), defining object relationships and preventing schema drift.
● Translate strategic business objectives into system logic, automation rules, and workflows to create a scalable operating platform that generates measurable, actionable data.
● Define and enforce strict "Entrance and Exit Criteria" for all business process stages to prevent data errors (the enforcement aspect).
● Manage the change control process for system updates to prevent disruption to active workflows.
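The "Entrance and Exit Criteria" enforcement described above can be sketched as a binary gate that blocks a record from advancing to the next stage until required fields are complete. This is a hypothetical illustration; the field names and stage are invented, not taken from the posting.

```python
# Hypothetical "entrance criteria" gate: a record may only advance to
# Production if every required field is populated. Field names are
# illustrative assumptions, not the company's actual schema.
REQUIRED_FOR_PRODUCTION = ["customer_id", "sku", "quantity", "approved_price"]

def entrance_check(record: dict) -> tuple:
    """Return (passes, missing_fields) for the Production entrance gate."""
    missing = [f for f in REQUIRED_FOR_PRODUCTION
               if record.get(f) in (None, "")]
    return (not missing, missing)

# A deal with an unapproved price is held back, and the gate reports why.
deal = {"customer_id": "C-104", "sku": "WID-9",
        "quantity": 12, "approved_price": None}
ok, missing = entrance_check(deal)
print(ok, missing)  # False ['approved_price']
```

A gate like this makes failures explicit and auditable: Production never receives a record with undefined variables, and Sales gets an immediate, specific reason when a handoff is rejected.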
Business Intelligence
● Build decision-grade operational reporting and analysis (this role does not cover exploratory data science/research or data engineering).
● Query and analyze cross-functional data to drive strategic business decisions, identify performance gaps, and uncover opportunities for revenue optimization and growth (e.g., ROAS, marketing attribution, churn risks, customer LTV).
● Own and facilitate the weekly business review, working with management and leads to refine reporting and insights across the organization.
● Design and maintain management reporting dashboards to track key performance indicators and operational health.
Decision Authority
This role has final decision authority over the following areas:
● Operating system structure and data definitions
● Workflow stage definitions and gating logic
● Approval or rejection of system changes that affect data integrity
Desired Qualifications & Traits
● Systems Thinker: Possesses strong systems thinking capabilities, naturally visualizing the downstream effects of upstream changes (e.g., how a change in the Sales form affects the Production floor). They prioritize long-term scalability over short-term "hacks."
● Pragmatic Architect: Maintains a pragmatic approach to architecture, balancing "perfection with business utility." They know when to implement a rigid constraint and when to allow manual flexibility, always focused on delivering high-utility features.
● Operational Excellence Steward: Demonstrates operational discipline and the ability to define, promote, and enforce process compliance among diverse teams. They value consistency and predictability and are willing to say "No" when requests threaten system integrity and guide the team to the right trade-off.
● Analytical & Problem-Solving Mindset: Possesses an investigative nature, focusing on finding root causes and proactively hunting for "process leaks" and undefined variables. They validate assumptions with data rather than anecdotes.
● Coach & Change Leader: Possesses high emotional intelligence and the teaching ability to re-program legacy habits. They can explain why a new system is better to resistant teams and guide them through the transition with patience and clarity.
● Ambiguity Simplifier: Has the ability to simplify ambiguity, taking chaotic business inputs and structuring the information into linear, standardized processes.
● Translator & Data-Centric Communicator: Has strong communication skills to fluently bridge the gap, explaining technical constraints to non-technical stakeholders in plain English.
● Detail-Oriented: Is highly detail-oriented, obsessed with consistent naming conventions and data definitions. They notice misalignment in data definitions immediately, ensuring organizational clarity and data integrity.
Experience & Educational Requirements & Preferences
Experience & Educational Background
● 7+ years of experience in Business Operations, Systems Administration, or Data Analysis.
● Bachelor’s degree in Business, Information Systems, or a related field required; Master's degree preferred.
● Experience with people management and team building
Platform Expertise & Architecture
● Low-Code/No-Code Mastery: Advanced proficiency with Low-Code/No-Code platforms ( , Airtable, Salesforce) is required, including the management of complex automation rules, dependencies, and integration webhooks.
● Business Object Modeling / Relational Database Design: Proven experience designing relational database schemas (One-to-Many, Many-to-Many), specifically including the ability to translate flat spreadsheets into relational objects (e.g., separating "Orders" from "Line Items").
● API & Integration Knowledge: Ability to read API documentation to understand system capabilities/limitations.
● Lightweight Scripting & Automation (Preferred): Proficiency with basic data-related scripting (Python, SQL) or advanced spreadsheet macros (VBA) to independently manipulate datasets or prototype logic is a strong plus.
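The "Orders" vs. "Line Items" example above is the classic one-to-many split: a flat spreadsheet repeats order-level fields on every row, while a relational model stores each order once and links its items by key. A minimal sketch using Python's stdlib sqlite3 (table and column names are illustrative assumptions):

```python
# Splitting a flat "one row per item sold" export into relational
# Orders and Line Items tables (one-to-many). Schema is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (
    order_id   TEXT PRIMARY KEY,
    customer   TEXT NOT NULL,
    order_date TEXT NOT NULL
);
CREATE TABLE line_items (
    item_id  INTEGER PRIMARY KEY,
    order_id TEXT NOT NULL REFERENCES orders(order_id),
    sku      TEXT NOT NULL,
    qty      INTEGER NOT NULL,
    price    REAL NOT NULL
);
""")

# The flat spreadsheet repeats customer and date on every row.
flat_rows = [
    ("O-1", "Acme",  "2024-01-05", "WID-9", 2, 10.0),
    ("O-1", "Acme",  "2024-01-05", "GAD-3", 1, 25.0),
    ("O-2", "Birch", "2024-01-06", "WID-9", 5, 10.0),
]
for order_id, customer, date, sku, qty, price in flat_rows:
    # Order-level fields are inserted once; duplicates are ignored.
    conn.execute("INSERT OR IGNORE INTO orders VALUES (?,?,?)",
                 (order_id, customer, date))
    conn.execute(
        "INSERT INTO line_items (order_id, sku, qty, price) VALUES (?,?,?,?)",
        (order_id, sku, qty, price))

# With the split done, order totals fall out of a simple join.
totals = conn.execute("""
    SELECT o.order_id, SUM(li.qty * li.price)
    FROM orders o JOIN line_items li ON li.order_id = o.order_id
    GROUP BY o.order_id ORDER BY o.order_id
""").fetchall()
print(totals)  # [('O-1', 45.0), ('O-2', 50.0)]
```

The same modeling applies in low-code platforms: the "orders" and "line_items" tables become two linked objects, and the foreign key becomes a linked-record field.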
Process, Intelligence, & Change Management
● Business Process Modeling (BPM): Experience with Business Process Modeling (BPM), including creating detailed swimlane diagrams to visualize hand-offs and defining strict "Entrance and Exit Criteria" for process stages.
● Business Intelligence (BI) & Reporting: Proficiency in designing Business Intelligence (BI) dashboards and reports, with an understanding of how to structure data for customer segmentation and cohort analysis.
● Change Management & Training: Experience managing change, designing rollout plans, and creating training materials and SOPs for users in a fast-paced environment.
Healthcare Business Intelligence & Analytics Analyst - Information Technology
Location:
620 Foster Avenue Brooklyn, NY 11230
Hours:
Full Time
Premium Health Center, a rapidly growing FQHC in Brooklyn, is seeking a detail-oriented and analytical Business Intelligence (BI) Analyst to join our growing Data & Analytics team. This role blends data analysis with light data engineering to build robust data pipelines, deliver actionable insights, and create high-quality reporting and analytics. The BI Analyst will play a key role in transforming raw data into actionable insights that will directly inform strategic, clinical, operational, and financial decisions across the organization.
Time Commitment:
Full Time, Hybrid Eligible
Responsibilities:
Analytics, Visualization & Storytelling
· Design, develop, and maintain dashboards, reports, and data visualizations in Power BI (or similar tools)
· Apply data visualization and storytelling best practices to create intuitive, user-friendly dashboards.
· Translate complex healthcare data into clear, actionable insights that support decisions for clinical, operational, finance, and executive teams.
· Develop and maintain semantic data models, KPIs, and performance metrics aligned with FQHC goals.
· Collaborate with stakeholders to gather requirements and recommend effective analytical and visual solutions.
· Analyze healthcare data from EHR systems (e.g., eClinicalWorks, Office Practicum, etc.) and other sources to identify trends, gaps, and opportunities for improvement.
· Support UDS (Uniform Data System) reporting and other regulatory compliance requirements.
· Create sustainable reporting frameworks for recurring healthcare and operational metrics.
Data Engineering & Pipeline Support
· Build and maintain light ETL and data integration tasks using SQL, APIs, and scripting tools.
· Write and optimize SQL queries to support analysis, dashboards, and data pipelines.
· Perform data wrangling, cleaning, validation, and transformation to prepare datasets for analysis and reporting.
· Ensure data integrity, accuracy, and security in all reporting and data engineering workflows.
· Perform data validation, reconciliation, and root-cause analysis for data quality issues.
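The wrangling, validation, and rejection-for-review steps listed above can be sketched as a small cleaning pass run before data reaches dashboards. The record shape and field names here are illustrative assumptions, not an actual EHR export format.

```python
# Hypothetical light cleaning/validation pass over EHR-export rows:
# normalize fields and reject incomplete rows for root-cause review.
from typing import Optional

def clean_visit(row: dict) -> Optional[dict]:
    """Normalize one visit row; return None if it fails validation."""
    required = ("patient_id", "visit_date", "provider")
    if any(not str(row.get(k, "")).strip() for k in required):
        return None  # incomplete row: route to a data-quality queue
    return {
        "patient_id": str(row["patient_id"]).strip(),
        "visit_date": str(row["visit_date"]).strip(),
        "provider": str(row["provider"]).strip().title(),
    }

raw = [
    {"patient_id": " 123 ", "visit_date": "2024-03-01", "provider": "smith"},
    {"patient_id": "",      "visit_date": "2024-03-02", "provider": "jones"},
]
cleaned = [r for r in (clean_visit(x) for x in raw) if r]
print(cleaned)  # only the complete, normalized row survives
```

In practice the rejected rows would be logged and reconciled rather than dropped, so recurring failures can be traced back to their source system.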
Collaboration and Data Literacy
· Collaborate with clinical, operational, and executive teams to understand business needs and translate them into technical solutions.
· Provide training, documentation, and support to improve data literacy and promote appropriate self-service use of organizational dashboards.
· Collaborate with IT and data teams on architecture, governance, and data quality initiatives.
Requirements:
· Bachelor's degree in Data Science, Public Health, Health Informatics, Computer Science, or a related field.
· 4+ years of experience in a BI, data analyst, or similar role, preferably in a healthcare or FQHC setting.
· Strong proficiency in SQL, including complex joins, window functions, and data transformations.
· Hands-on experience with Power BI, or similar BI platform, including DAX, data modeling, and visualization design.
· Experience working with scripting languages (Python, R, etc.) and APIs to support data integration and automation.
· Experience with semantic data modeling in Power BI.
· Strong analytical, critical thinking, and problem-solving skills.
· Excellent communication and data storytelling skills with the proven ability to present insights to non-technical audiences.
· Detail oriented with strong data troubleshooting and validation skills.
· Highly organized, with the ability to manage multiple tasks and deadlines.
· Self-starter who works independently and collaboratively.
· Ability to partner cross-functionally across clinical, operational, financial, IT, and data teams.
· Fast learner with adaptability to evolving tools and organizational needs.
· Strong commitment to high standards of data quality, accuracy, and confidentiality.
· Familiarity with HIPAA or other similar data privacy standards.
Preferred:
· Experience with Microsoft Azure, Fabric, Purview, or similar cloud platforms.
· Experience with Power Automate or similar tool for basic workflow automation.
· Familiarity with Git or similar version control tools.
· Experience with EHR systems (eCW, Office Practicum, etc.).
· Understanding of healthcare data, including clinical, operational, and financial metrics.
· Experience with UDS reporting or other healthcare regulatory or quality metrics.
Compensation:
$110,000 - $145,000, commensurate with experience
Benefits:
· Medical, Dental, Vision and Life coverage
· Paid Time Off and holidays
· Employee Assistance Program
· Flexible spending account
· Public Service Loan Forgiveness (PSLF), NHSC Loan Repayment Program
· 403(b) Retirement Plans with employer matching