Data Architect - £70-80k

Jason / Jason@nekolondon.com

Are you looking for an opportunity to really challenge and develop what you already know about data? Want to work for a data-driven global agency that's partnering with some of the most exciting brands in the world, including Google? Interested in working on a data hub that processes 5 terabytes of data a day? We may have something for you.

Our client blends data science, objective media and captivating experiences to build valuable connections between brands and consumers. They're seeking a Data Architect with extensive experience in building data models to join their Architecture team. The role itself will focus on guiding a team of data developers to integrate external data sources into an Enterprise Data Model, and on overseeing the exposure of that data to the business. You will help shape the Model by collaborating with the Product, Analytics, and Business Intelligence teams.

Duties of the role include:

  • Reviews, evaluates, and designs data services, physical and logical data models, and implements high-quality data architectures.
  • Designs and develops database systems and data flows, together with their corresponding data models and high-level information architecture.
  • Identifies key data sources from target systems to meet project requirements, constructs data quality control solutions, provides data flow diagrams and implements conformant ETL pipeline designs to support high-quality reporting, analytics and BI.
  • Manages and reviews data, including daily data updates, review, and evaluation; query revision; data scrubbing; code modifications and construction; design and implementation of Data Dictionaries and Data Standards.
  • Develops strategies for data acquisition, privacy, retention/archiving, replication, warehousing and quality.
  • Establishes, maintains and documents database and data model standards.

The ideal applicant will have five years' experience of conceptual, logical and physical data modeling, data lineage/data flow mapping and data quality, along with similar experience of data warehouse and multi-dimensional database design and development using formal methodologies.

Strong knowledge of at least one RDBMS (e.g. Postgres, Oracle, SQL Server) is required, along with expert SQL writing and tuning techniques. Experience with big data environments (Hadoop, Hive, Spark, HBase) is expected, and the ability to communicate using diagrams (BPMN, ERD, sequence diagrams) is important.

Desirable skills include:

  • Knowledge of Presto and Looker as data exposure tools
  • Data Vault modeling
  • Lambda architecture
  • Linux system admin
  • Java development
  • Agile development methodologies

We're actively interviewing for this position, so please do not hesitate to send through your CV to find out more.

Jack O'Shaughnessy