Analyst – Data Engineering


Job title: Analyst – Data Engineering

Location: London

Hours: Full-time

Wage: £42,000 per annum

Work Pattern: Monday – Friday

Employer: The Boston Consulting Group UK LLP

Duration: Permanent

Date advert posted: 4 August 2017

Closing date: 1 September 2017

Job Description

BCG's Advanced Analytics group, GAMMA, delivers powerful analytics-based insights designed to help our clients tackle their most pressing business problems. We partner with BCG case teams and practices across the full analytics value chain: framing new business challenges, building fact bases, designing innovative analytics workflows, training colleagues in new methodologies and interpreting findings for our clients. The GAMMA team is a global resource, working with clients in every BCG region and in every industry area. It is a core member of a rapidly growing analytics enterprise at BCG – a constellation of teams focused on driving practical results for BCG clients by applying leading-edge analytics approaches.

BCG has established GAMMA to support case teams in identifying and realizing insight and opportunity by applying advanced analytics to our clients' problems. Data Engineers, Data Scientists, Scientific Modelers, Scientific Programmers, Data Visualization Specialists and Geospatial Analysts form the core of the GAMMA team. They work closely with case teams to provide both consultation on analytics topics and hands-on support, focusing on data engineering, advanced analytics tools, modelling, visualization, and bespoke client-facing analytics deliverables.

The Analyst – Data Engineering will join a growing team with a global presence and will work with both clients and case teams worldwide. Successful candidates will possess technical expertise in data engineering, strong collaborative and execution capabilities, excellent communication skills, a practical and flexible mindset, and high attention to detail and work quality.

Key responsibilities

The key responsibilities involve the following:

  • designing databases, developing extract-transform-load (ETL) processes and facilitating customised (statistical) analyses for client work;
  • collaborating with case teams to identify and accurately assess client needs and gather business requirements;
  • delivering accurate, fully tested output and documentation that meets business requirements;
  • performing quality assurance according to standards and guidelines for client projects;
  • working on site or remotely with case teams and clients;
  • working as an individual contributor directly with a case team or as a team member in larger case team settings;
  • providing quick-response subject-matter support to case teams;
  • delivering the highest-quality work both within the team and to case teams and clients;
  • training clients, consultants and team members on the use of data tools and methods;
  • performing internal team projects and duties as required; and
  • travelling depending on project requirements.

Essential qualifications, skills & experience

The successful candidate will have all of the following qualifications, skills and experience:

  • an advanced degree related to data engineering, machine learning, computer science, applied mathematics, data science or IT; 
  • 1 to 2+ years of experience focused on data engineering in a consulting firm;
  • previous experience of data management in a cloud environment (AWS, Azure, Google Cloud), including extracting, transforming and loading large datasets; setting up and managing databases; software development and architecture; and a technical understanding of system capabilities and constraints;
  • working experience in a global organisation;
  • experience with collaboration tools and ticketing systems (e.g. JIRA, Confluence, GitHub);
  • experience with continuous integration; and
  • fluent written and spoken English.

Key competencies

The successful candidate will have all of the following competencies:

Professional capabilities

Problem solving, communication, interpersonal and teaming skills

  • ability to effectively handle difficult and stressful situations with poise, tact and patience;
  • ability to anticipate, identify, and solve critical problems;
  • excellent interpersonal and communication skills;
  • ability and willingness to give and receive honest, balanced feedback; and
  • ability to act as a thought partner and expert with stakeholders at different levels.

Work management, organization and planning

  • excellent organizational skills, attention to detail and efficient time management;
  • proactive communication of issues, priorities and objectives; and
  • ability to thrive in a dynamic, fast-paced, demanding environment.

Customer and business focus

  • strong collaborative skills and the ability to adjust approach to effectively interact with clients; and
  • focus on excellent client service and close attention to client needs.

Values and ethics

  • demonstrates competence and character that inspires trust;
  • flexible, self-motivated, and proactive out-of-the-box and critical thinker; and
  • able to respect all BCG and client information as personal and confidential.

Technical expertise   

  • Experience in core big data engineering activities (one or more of the following):
      • understanding of the uses of different databases (key/value, NoSQL, relational, graph);
      • database architecture and optimization (sort keys, indexes, query optimization, etc.);
      • data loading and management tools (Redshift COPY, SQL Workbench/J, Oracle SQL Developer, etc.);
      • data cleansing;
      • data modelling (variable transformation and summarization, algorithm development, etc.); and
      • the Linux command line.
  • Familiarity with a broad base of analytics tools (one or more of the following):
      • data management: e.g. Spark, SQL Server, Amazon Redshift, Amazon S3, PostgreSQL, Hadoop/Hive, Alteryx, Neo4j, Teradata and other relational databases; and
      • programming and/or scripting experience (Python, R, Scala, Java, C#).
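By way of illustration only (this sketch is not part of the role specification), the core activities listed above – extracting data, cleansing it, loading it into a relational store and indexing it for query performance – can be shown in a minimal Python example. All names here (the sales data, table and column names) are hypothetical:

```python
# Minimal, self-contained ETL sketch using only the standard library.
# The "sales" table and its columns are illustrative, not from the advert.
import csv
import io
import sqlite3

# Extract: in practice this would read from S3, Redshift, a feed, etc.
raw = io.StringIO(
    "date,region,amount\n"
    "2017-08-01,UK,120.5\n"
    "2017-08-02,UK,abc\n"      # malformed record, to be cleansed
    "2017-08-02,DE,98.0\n"
)
rows = list(csv.DictReader(raw))

# Transform: basic data cleansing — drop rows with non-numeric amounts.
clean = []
for r in rows:
    try:
        clean.append((r["date"], r["region"], float(r["amount"])))
    except ValueError:
        continue

# Load: create the target table, bulk-insert, and add an index
# (the "indexes / query optimization" side of database work).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (date TEXT, region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", clean)
conn.execute("CREATE INDEX idx_sales_region ON sales (region)")

total = conn.execute(
    "SELECT SUM(amount) FROM sales WHERE region = 'UK'"
).fetchone()[0]
print(total)  # 120.5 — the malformed UK row was dropped in the transform step
```

In a production setting the same extract-transform-load pattern would run against a cloud warehouse (e.g. Redshift or PostgreSQL) rather than in-memory SQLite, but the stages are the same.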

To apply: please upload your CV and a cover letter directly through the portal.  
