Overview
On Site
Depends on Experience
Full Time
Skills
Adaptability
Amazon Web Services
Analytical skill
Analytics
Attention to detail
Business intelligence
Cloud computing
Collaboration
Computer science
Dashboard
Data Analysis
Data Science
Data Visualization
Data integrity
Data management
Data manipulation
Data modeling
Data quality
Data validation
Data warehouse
Decision-making
Extract, transform, load (ETL)
Git
Google Cloud Platform
KPI
Microsoft Azure
MySQL
Performance tuning
PostgreSQL
Python
Real-time
Relational databases
Reporting
SQL
SaaS
Scalability
Software engineering
Transformation
Version control
Job Details
About the role
- Manage and optimize the Looker platform, integrated as a white-labeled solution within Harness.io's Software Engineering Insights product.
- Collaborate with engineering, product, and data science teams to design and implement BI, data visualization, and analytics solutions aligned with product and business goals.
- Design and maintain scalable data models and ETL processes that support efficient data ingestion and transformation for reporting purposes.
- Develop and maintain data models, SQL transformations, and LookML objects, ensuring data quality and alignment with reporting requirements.
- Create and optimize Looker dashboards to support in-depth analysis, business KPIs, and real-time reporting, focusing on scalability and performance.
- Implement data validation and quality checks to maintain high data integrity across the BI platform (a rough illustration of this kind of check follows this list).
- Conduct exploratory data analysis and provide actionable insights to guide product enhancements, user engagement strategies, and key decision-making processes.
- Stay up-to-date on BI tools, best practices, and trends to continuously improve the efficiency and capabilities of the BI platform.
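As a sketch of the data-validation responsibility above: the posting names no specific tooling, so the table name, columns, and sqlite3 stand-in below are purely hypothetical, but checks of this kind often reduce to a few SQL assertions run against a warehouse table before it feeds a dashboard.

```python
# Illustrative only: minimal data-quality checks of the sort described above.
# Table/column names (fact_builds, build_id, duration_ms) are hypothetical;
# any DB-API connection (PostgreSQL, MySQL, ...) could replace sqlite3 here.
import sqlite3


def run_quality_checks(conn, table: str, not_null_cols: list[str]) -> dict[str, bool]:
    """Check that the table is non-empty and that key columns contain no NULLs.

    Identifiers are interpolated directly for brevity; in real pipelines they
    would come from a trusted configuration, not user input.
    """
    cur = conn.cursor()
    results: dict[str, bool] = {}

    cur.execute(f"SELECT COUNT(*) FROM {table}")
    results["non_empty"] = cur.fetchone()[0] > 0

    for col in not_null_cols:
        cur.execute(f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL")
        results[f"{col}_not_null"] = cur.fetchone()[0] == 0

    return results


if __name__ == "__main__":
    # In-memory stand-in for a warehouse table feeding a Looker dashboard.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE fact_builds (build_id TEXT, duration_ms INTEGER)")
    conn.execute("INSERT INTO fact_builds VALUES ('b-1', 4200), ('b-2', NULL)")

    checks = run_quality_checks(conn, "fact_builds", ["build_id", "duration_ms"])
    print(checks)  # {'non_empty': True, 'build_id_not_null': True, 'duration_ms_not_null': False}
```

In practice, assertions like these would run inside whatever ETL or orchestration layer the team uses, against the production warehouse rather than an in-memory database.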
About you
- 3-6 years of experience in a BI Developer, BI Engineering, or Looker Admin role, ideally within a SaaS or tech environment.
- Proficiency in Looker, including dashboard creation, LookML, and data visualization, with a strong grasp of BI best practices and performance optimization.
- Background in computer science or a related technical field, with solid experience in SQL, data modeling, ETL processes, and handling large datasets.
- Experience working with data warehousing and relational databases (e.g., PostgreSQL, MySQL), as well as integrating data from diverse sources.
- Familiarity with data pipeline tools and a willingness to contribute to ETL processes and data management.
- Ability to collaborate effectively with cross-functional teams, including engineering, data science, and product, to translate business needs into technical requirements.
- Strong analytical mindset, attention to detail, and adaptability to changing project priorities.
- Nice-to-have: Experience with cloud platforms (e.g., AWS, Google Cloud Platform, or Azure), Python for data manipulation, and familiarity with version control systems like Git.