JOB TITLE: Data Engineer
LOCATION: Remote
THE ROLE: As a member of the Data Engineering team, you will create tools, pipelines, and systems that enable the business to operate reliably at scale, gain mission-critical insight, and power engaging data products for our customers. You will build large-scale observability into problems that are front and center for the business. Along the way, you'll champion a culture of data literacy and experimentation, helping Splice build the best possible product for music creators everywhere. If this sounds like exciting and fulfilling work to you, apply today!
WHAT YOU'LL DO:
- Build and maintain self-service tools and extensible datasets that enable our peers across the whole organization to get the insight they need.
- Own and operate the structure of our Data Warehouse, ensuring quality, durability, and reliable builds of our pipeline.
- Address scalability issues, automate manual workflows, and add confidence to our analytics by simplifying and modernizing our datasets.
- Ensure the quality of our data by writing tests, building observability into our pipelines, reviewing RFCs, and providing guidance in data modeling.
- Participate in a business-hours-only on-call rotation to ensure the uptime and quality of our systems.
- Create and cultivate a culture of data literacy, experimentation, and data-driven decision making.
JOB REQUIREMENTS:
- 3+ years of experience building scalable and durable software.
- Demonstrated proficiency with Python, SQL, and Unix fundamentals.
- Strong familiarity with OLAP and OLTP databases.
- Experience with data transformation frameworks, such as sqlmesh or dbt.
- Experience with business intelligence platforms or data visualization frameworks, such as Looker, Hashtable, or Observable.
- Strong debugging skills, especially with distributed systems.
- Experience building and supporting cloud infrastructure on Google Cloud Platform (GCP) and Amazon Web Services (AWS).
- Clear and consistent communication in a distributed environment.
NICE TO HAVES:
- Experience building Infrastructure as Code (IaC) with Terraform.
- Demonstrated proficiency with observability tools like StatsD, Datadog, Cloudwatch, etc.
- Demonstrated proficiency with containers and container orchestration.