
Operations Analyst

Time Zone: Americas

Department: Operations

Seniority Level: Entry

Employment Type: Full-time

Summary

Join Exponential Technology as an Operations Analyst supporting the technical operations team that keeps ETI’s data platform running for institutional investors around the clock. You will work directly alongside the Americas and APAC Operations Managers, handling data pipeline monitoring, QA validation, incident triage, and infrastructure tasks. This is an ideal entry point for someone early in their career who is highly organized, technically curious, and wants hands-on exposure to every layer of a production data platform. No minimum years of experience are required, but you must be sharp, disciplined, and already using AI tools to work faster than people expect.

Day to Day Responsibilities

  • Monitor data pipelines and automated workflows for failures, latency anomalies, and data quality issues.
  • Perform QA validation on datasets before client delivery — run checks, compare against expected schemas, flag discrepancies, and enforce zero-tolerance data quality standards.
  • Triage and respond to customer support tickets during Americas operating hours.
  • Assist with Kubernetes and Docker infrastructure tasks: deployments, container health checks, log analysis, and routine maintenance.
  • Create and maintain monitoring dashboards (Grafana, Prometheus) and ensure alerting rules are current and actionable.
  • Execute daily data onboarding procedures — verify that new and recurring datasets are updated accurately and on schedule.
  • Maintain and update operational runbooks, process documentation, and standard operating procedures.
  • Support cybersecurity compliance tasks for SOC 2 Type II and ISO 27001.
  • Manage cloud cost tagging and generate cost reports across provider accounts (GCP, AWS, Azure).

AI and Automation Responsibilities

  • Use Agentic AI platforms to automate routine monitoring, triage, and alerting tasks.
  • Leverage AI coding assistants (Claude Code, Windsurf, Cursor) to write and maintain scripts for data validation, pipeline health checks, and infrastructure automation.
  • Build AI-powered QA checks that catch data quality issues before manual review is required.
  • Use LLM-based tools for documentation generation, runbook updates, and knowledge base maintenance.
  • Identify repetitive operational tasks and propose AI-driven automation solutions.

Requirements

  • Bachelor’s degree (required) in Computer Science, Information Systems, Engineering, or a related technical discipline.
  • Foundational knowledge of Linux (Ubuntu) — comfortable with the command line, basic system administration, and shell scripting (Bash, Python).
  • Basic understanding of Docker and Kubernetes concepts.
  • Familiarity with monitoring tools (Grafana, Prometheus) or willingness to learn immediately.
  • Comfort with data: able to work with structured datasets, run validation queries, and spot anomalies. SQL fundamentals are a plus.
  • Basic networking knowledge: DNS, HTTP, firewalls, and cloud networking concepts.
  • Proficiency with Agentic AI platforms and AI coding assistants — if you are not already using AI tools daily, this role is not for you.

Cultural Fit - Who Thrives at Exponential

  • High Executive Functioning Skills — Ability to plan, prioritize, organize, and execute operational tasks independently.
  • Excellent Communicator — Clear, precise verbal and written communication. Able to write crisp incident reports and escalation notes.
  • Team Player — Collaborative, ego-free contributor. Willing to pick up any task the operations team needs done.
  • Technical Curiosity — You want to understand how the platform works end-to-end, not just the slice you are assigned.
  • Composure Under Pressure — Maintains calm when pipelines fail, alerts fire, or clients need immediate answers.

Would Be Great If You Also Have Exposure To

  • Cloud operations experience with any of: GCP, AWS, Azure.
  • CI/CD pipelines and Git-based version control.
  • Data engineering concepts: ETL/ELT, data warehousing, columnar data formats.
  • Python scripting for automation and data validation.
  • Cybersecurity fundamentals or compliance frameworks (SOC 2, ISO 27001).
  • Previous internship or project experience in a technical operations, SRE, or DevOps environment.
  • Exposure to financial data, market data, or fintech platforms.

Compensation

  • Competitive base salary commensurate with experience.
  • Stock option grant — participate in ETI’s growth as a pre-Series A team member.
  • Performance-based cash bonuses.
  • Health, dental, and vision insurance.
  • 401(k) retirement plan.

About Exponential Technology

At Exponential, we empower enterprises to innovate more rapidly by leveraging on-prem and on-cloud data technology in combination with tightly integrated analytics and LLM toolsets. By integrating data into a high-performance, flexible data backplane, we help businesses of all sizes overcome data entropy and unlock their true analytical capacity as an organization. With over 25 years of experience in the institutional investment space, we leverage our knowledge to extract value from data with our state-of-the-art technology and analytics stack.


We are an AI-native company. Every team member — from engineering to marketing to operations — uses Agentic AI and AI-powered tools daily. If you are excited about working at the frontier of AI-augmented productivity in financial data technology, we want to hear from you.

How to Apply

Send your resume and cover letter to hr@exponential-tech.ai with the job title in the email subject line.
