Overview: 

Generate insight through our unique informal market data asset while establishing and implementing the analytical models required to enrich and automate our insights, improving our responsiveness to and understanding of the data. Establish a deep-rooted understanding of the data to drive strategy that fundamentally overhauls our decision-making processes, grounding them in logic and insight. Align our own data with external data sets that enrich our understanding and reveal predictable correlations and market insights. Through this understanding, package the data into easy-to-understand reports and dashboards aligned to our clients' needs.

Core Objectives & Key Responsibilities:

Objectives of this role

  • Work with data to solve business problems, building and maintaining the infrastructure to answer questions and improve processes.
  • Help streamline our data science and analytics workflows by improving data delivery and quality to internal and external stakeholders.
  • Work closely with the data science and business intelligence teams to develop data models and pipelines for research, reporting, and machine learning.
  • Be an advocate for best practices and continued learning, constantly challenging the status quo and striving for excellence.

Responsibilities

  • Work closely with our development team, data analysts and BI analysts to help build and maintain data flows that support our reporting requirements.
  • Use agile software development processes to make iterative improvements to our back-end systems, particularly our reporting DB.
  • Model front-end and back-end data sources to help draw a more comprehensive picture of user flows throughout the system and enable powerful data analysis.
  • Build data pipelines that clean, transform, and aggregate data from disparate sources and deliver quality usable data to data analysts and BI analysts for reporting.
  • Develop models that can be used to make predictions and answer questions for the overall business.

Data Processing & Management

  • Gathering/Extracting data from the database for analysis.
  • Cleaning and preparation of data for analysis.
  • Quality controlling our data processes and tables.
  • Identifying, analysing, and interpreting market data for clients.
  • Producing accurate BI reports and dashboards, within agreed-upon timeframes, that are:
    • Accurate and valuable
    • Understandable to both technical and non-technical audiences
    • Insightful, supporting quality decision-making for internal and external stakeholders
  • Create project tracking reporting for all stakeholders.
  • Setting up processes and systems to make working with data more efficient.
  • Exploring and interpreting data to identify trends and opportunities for business improvement.
  • Identifying data shortcomings and alerts within campaigns to drive proactive responses.
  • Providing recommendations on campaign performance improvements, based on client data and industry standards.
  • Researching new ways to make use of data, to improve business performance.
  • Managing the team to ensure that quality deliverables are produced timeously.

Stakeholder Engagement

  • Responding timeously to data-related requests and queries and keeping track of all requests.
  • Collaborating with key stakeholders on all aspects of report creation, incl. deadlines, deliverables, edits, and recommendations.
  • Scoping out the data models required to drive efficiency, with stakeholder buy-in.
  • Attending meetings with internal and external stakeholders to ensure an understanding of the project data and requirements.
  • Presenting information and communicating findings generated from data to stakeholders, suited to both technical and non-technical audiences.

Preferred Skills and Abilities:

  • Bachelor’s degree in computer science, information technology, engineering, or related analytical discipline required.
  • Three or more years of experience with Python, SQL, and data visualization/exploration tools (e.g., Power BI, Tableau).
  • Familiarity with common Python-based ETL tools such as PySpark or Apache Airflow.
  • Familiarity with Kimball and Inmon data warehousing approaches.
  • Familiarity with the AWS ecosystem, specifically Redshift, RDS and EC2.
  • Familiarity with PostgreSQL preferred.
  • Strong communication skills, especially for explaining technical concepts to non-technical business leaders.
  • Ability to work on a dynamic, results-oriented team that has concurrent projects and priorities.

Disclaimer: 

The information in this role profile is designed to indicate the general nature and level of work performed by employees within the specified role. It is not intended to be a comprehensive inventory of all duties, responsibilities, and qualifications required of employees assigned to this role.